[Binary artifact: POSIX tar archive containing `var/home/core/zuul-output/logs/kubelet.log.gz` (a gzip-compressed kubelet log collected as Zuul CI job output). The compressed payload is not human-readable text and has been omitted; extract it with `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz` to view the log.]
~'ajY:M{,ABatbmdɑda_H",3===}M"e!G "jXs4Cȅ5.p]TiaHBLB&].;?LkQ&ӧ'3& #Jn :(}}$HvLg$&k*2Pg`^YEYEΚuZ5A[XyŴ tXo mDO 3 41@N;{wgσw;sAw= 06^VAY0`bUZ55WL-re:Gxye.yƼG1FN ZVz$ vBgOl ލz}h%]3!x'4`t!_(?!v>Tp / s( A r7*_1jyy:(2bPF_Dמl?,9c.)fofOI0]!3dWJؕ< c "B)1c+֋m^Upv5j>pW]@|PbWJd j,?x䦰+V;B(5gȮDC2&iW6]!v] ]4 ˥::}QD_E.<5*,k)a\ o𲰢}`11.#=Dy^3FU% ;nO0LWhdL3R< XT4ŢhEPJ̢z(*آZ>eCȟ#D4:`$a+"YMH []ϴ@y7QsFr߾Z@WdU[fg:4'7v.9_ VK«!:tNiF z.&`ѵ 0 K$/z$P~fk2.XI-HS,(92iރM,P2JΆaFoCf S8L7xUV} |u>Xx ]hUjt~>?ܕWgD a /5V}gZ<]^o=Ltn,8[7-?ܴ̤K8ir@AD~śG)<&p}72㋄|7_ J0—: pO,'¸]h?˻ۃRxCN'i~k:x~A^e\L8 7zy?t߹] D/ȑ~ufѨ$ P$bi[{?'G8n4GG|mOIkwr(gK&W42ݪ5:aoz:[w4cNTg}kfn$Oꌂb.ڂ? 7`Ho]K G3a% 螇iR|'wS19S)/ 2Jt&'m ~Hן 䴙FL-&)2>ݺ=˴OA e(4CP@Tw\TP @aai3ą9dgcrH̔2QlŇ z U>8Dܘnq |prt$wq|1`<7u U l)>/|cΛ> S$pA)\wb!5>x߁FY#B/)d7n|.[4Zť&n Xlm~A;R- Z[n _+/^6&~qRt *:ɜ(Էss)GnKY˻`(no0D=ۍO"/~YQ8{ۍ{'< ()M3 8(Rɣ"qki6:>ߍyA(w@3'㩃9^'@~9"'rn i)~Ŀ\Wy(ʀtDٕ QzH7:Xoxܫt7f?3@S%llBq?A|O3Np5z> O(.1X5\Am8~'gO:пh!lwO?toI X !f;1ۍN !r$C)ۑ-klxFɇ^$RM|x7~{1dKS@ѕy'R0%`NH*Ao3[dIo0 7]|g ܶKF)Ƶn(!liIL˫/Ԫ^ۤ'B"i.&L,1 ªu\τiK; 8ye1 a1<Ef!\piI)>2%Z,+\@WRX  Z%+ YSI<P-.8.?DT5Y^kFck5WXRz @>%^ k4Jaa+FakFϧhAYrzEt[mdK9˓xNGu>EQW*TQe䩅*]Q*}%)Jցq΢#gfSWFp8UUs43M9p @*p刜~&87c/MUfMISx$ڏ5 fOӘ3kqx*MH* QzȖ )fi̞kܓ()4շ/4}PiX6G۔:6=R]CYɢqZl;mnF傆F5,(v MS4{DxN9y9lM% џ䎉(>Ews3<#xv8Y0p.l#Z-ٻ6cUIÀ8"{a SBRjwOr$H<^lM=Uտ~z~59B`"N0^ U|jw/|g'#r9gVic`n q$wel3.}ۛ{cH:>2,-ntKŝͮvtB䌂J*cRb2叫*!ϮOor pKp3"اgLp8%b"c1g;JaظCL`w1oMq4dsؤZ:f=3w?|;~ybRnVM.d^?r hI20C6,eG~Lb=r,hGeΩLqXt<Y,f!Ynu '1beP!2чUtJ`w ZGK!zƌƁZ 1Z+j$~Wr p74I'")[$!il8KC;jBRuU=y3f0."T#8x0q Jd`ߕ7{({o_7`Fe3 nGLfELFj[Cvn>fđrT)  42c뀭8@Eo]{ EU6!`BLJ;f2RʃprdZ/5^#qOנ5r[q~v~}`ie) RBE9ϏEESOe 6UQ>4 D ks0x|$1%qIEH:byH蔊}ߥޞ?{݇3Lٻ;-(?yɞ)OׇM@?`;CiWCxT6C.ߺ$xqu0p}A;v5 J?~;^qunvqZGb΄nwM t%u=#_} b~[ˣb.YukV }Zf.|B`&*V6L{mIf1Jwau0X7_ ?%Y#2"}ۇV~g[rc*rXՖu3 cn]*n[\.H LgS˳aw,CjyvzˣuOSu6K1VK/jZF]{ryZ)Bzt[N2/bكe Uy~B [VRTuRF%6%%ԦP}UזM-d"a}(9fUA7u1Kvoh@ )Z1^vҘVUv2 uB}mnA+M <@P ˳'XxqO"2[TWkD1D֜O;@U/R0Dx6Tw2Tϗ~$kԊ c ~f KnէEHpSN tїhMs;skECgnϗ\fM6+!<ޖp%ZF76`e`1y̱\1i:2Dz|)T{KۤaE! 
$dКZ9a5Xp9 h1>A(GwE,1 ?wo"q4_V&;8m3:m5ا/`2I9Lz݃#aoYvh DBa:}0 4_V9ӻREl q, h*g2&,k'^swC!HRK}֩! R0IݹƮO.EGXfHP'{vܜr0(QmGs+:2,"1rbr[ )@wH3C@ѴJODޔkίet;c~=v8W.AuC߿{ Ej__כӏoNjToﳂf"_L>wƐi%Bv6$_0\䳹53 xˆ^Fcyd!R&R/5eDDL ` Xy$RD[~~o!BquAc#XịUh1ZleF͝QFGM*$;A Att:LvY_qv  p<7ZZ3Zg3QLxױzCnZJhJ0[0-:l=5lpN+9?srTy:ߌ x$ VwZ D佖豉x i7S^9rt9(ͽk2rZmm-:\Ȭo >]nJx23E5lk=ɽSMg8i Ʉl6݃kyOsMPo&>_3\4#g D%3 ٓwh{I?a=q-?hr>9 Sc<=bz4@VCDJ۪m H8GYl%MRpn N##ӱw,rQaHXG" Us-o|6RC#246ɫsք*\_y.w;E 8I 1a3:9yk}@nGqqs2^>'ǞԚԍ9Ϛ͠D2N"ђ'qdt3jEi^ANuX]5EYyfBUn$_%ǠQ[ק4L2Q-}~aazmY8Q;'`M|sp7vgڰ5WvmytE`,~oNKrh#޳jY~^fq(_:e58BЅ:D0[[&QFUG{=xK(&Ņpl0:Ac4&lͱ<NiIQ2,K06}j&Ï'- i`~-=I- =ϲ뛯&S$I7]~N t-áBR⮊zWolAhJ_w f88L2Z:u)1B5 C]C)|@˴|CX_FVUC۳9@y.N=N[w,J&#a;oX]ΡIګ>iե"lWzz+FQc02jl:{؇ $D\mrY)vC)T"%xM<4>F-]hkS{ 7)ےIu[þGT[1F偉:XeE7z.5|_8',)=G6l-,;|OCްq54OeCOݦP@f±#^&&eDH ߍ;jt'=ټ".qG @)LqѼ@ AM@6ch~,tx# VǖjO mjJ<4*4M>IK s` iR,b$O. - A]3A׵Lۥe:63Y J-3d`\™9L)`w1R Sv..%NjɜA9$rLeua !2>M^x$hIm [}xs|mxXYٱx[`f}`:uHao-_-X4'4XӥNZy9 yc8u5i4e9aQz!3d'`}4hQZ`6n`#d>Q!=IS<- f鎰7 s: -(h-Ƕ|sP,1㜘q %bè{祥#@&^׆-mk Dbc: N-l/P)>le4Ju[ Ǵ6@3L4ɄGEcFLXBY!UOo:w){޾r NL+\[*.Zz=z+o<3,8BqǹA߾I*A@PSj}XeJްhWr> 7M\Rz@moGy-|TZV9a/`a8*ep"׮d5rLX[)(`4H0XĆpmnEa9nn9{lY)M(\Ԩ'Hjy{Mj7()G|PUR';4r؀m/t-H:@ H#LiTL*pPZS6!H?ͅ> 뇺fYVQ;૊Pj nR.= zƵtm̮x[8eS>Ac!}A4֔) @}$\Q< h LMwp==<]X< cٶaZ͸3)ж!567uny p:nTy̨-sfgH6ZS()A $X/aHaa&`:۲oD_:m l\w"/v~("U(p[wYFcjۛF- 3. FCaku¥m 3䵕a&|y&`v@D'[(&/Nhm NbF"CPDDP9}m=xxw}9X_ ŰD˛wǯwuj4IYvnSڵ`냵+E,h?Ljϯ:~]v& #j')K/u nLS*&N]w. 
wLM뷤a|]u] ɉ%!&р6"40:pbVt xDs%!*7Y*Ns~*q m)sa( ͑=I8G(uǝ: I %>"I#UI$$(wNW~SÍ=K-J(G[Q|&U!BcG7\^߁Jw{h0L"T?Ait=8X5jA"|{ݦIT1Z%/DtBY'azskƏ)2vίϯ6v=<|I*A~FEӣ~Bȏ)\ӿ,W¹y~TU] l>)ApW*_kDN[~=)J=ң&@X<|*{l ULhcYVJO|٢TՉ/:d(/{J``{2\JeIO0X;#Xg jFjԀe.T79]%!Q/Y|iqpo<#10"}A.!.nj$(̍]Mˠ3Ѐ' QL>yF0QE`AhO2gD9p ;BOCey4K/5%46S1F Bdޗ.vƸ|Wκ܁Ʒs#_۫y-Hxy埛p`5/#^-oJVU;6ϝ|U,Xu|؊茲$sQ4\“Rb\:eQV|KXsJDF`]@8b)EU[)2)׫I45yq |wg2\-Cxbi74'a̢n!jhe==V>/1oFiceA,-/`V|O^C7]lky-?-(X%F.`;^ kQsEk z4:-Bqs|+1~DQ<mp7' vAlYgkbȦJKLS7uwޯk WzW6,j3)QWxρl0jbR ΡrX-W}Mr TQq5MD%+W>AfGr at/7;}j1wY^+;Pep{ U)VvPK̿%nk5Kr{fs/iFM:{yt gi|?RpڹRuF8Ohxw{U$ONox,x me0Oyl&.29}~~On$ۇ(E6HOz(yz az.R}~sY|?3(os1? 7P2uM`\g: z$poԿd&q+MKidۦ.g6jG}o-Xu!<jSZZf0L'>'7Dʄ|l߸R2);4= <×R԰Y0C] + (Z̷ LgaMĝRa>8aR[h('G934dt|{yĊ(MDt63lFYZ\ȩe ?NkTƷsyYȀ<-Xpp\:U7{д5;F kLzfÌMYB?gnPkNH=@1;5 ]jZ {19sܚP <`doy2fP mNCۗF Btw ː4i3{dnNO[wnٻ6reW[{b`{-A 33_jSR1,!@*S<^t0Z;3^Xrq9I,eHZ;ɘM5MA8I"3yj7Y I$@|ס .ML򙣉$M,ISNf iJ`&:))o%M(+_qPc_`7_օ?E9G * 'm-#)D*8#mIpc-M0M32%]@PR%29α;ʉRғ/hH%AZ 9(p,=|Zb)b̵6j⿷J+ ($]-$%o}r*aBևp cAD?EJ^mӍۺr)'G{Ml 0`8"avm_ mkո@ ^oF"AV2,304BVIDT4>L!3 570!u]+na±W˖4Lxu;{eP$,FouVwz^cLF(81zZeq*Cu WNmhlLE(sLJE"`߻* G]b#lq-?֛Qfùq5_{asJ6۰]Ь>'Ƨt=2Agv3, 6[AG Y+ysWLtwاO1opy4rz9 z[ʈhuI~zooF9BsM'՛HW?^<{?/j{5LDm(azV@f&l9%Uu ]S:Y娋 _lC]ِӃA[׃5oW{MsPjEfuid7{BU3O2&L b~_XM^LPҗU"C|,dcqhƘk#_é\Đ;Q ˍ@;Yc?9g!QF7Hu>r+W_o: e68aƲ4I!L)GlRqɑ`H+u)sD!y@ 4{_ wLD@f{Ë [Xkm6V յ~{In=1*s" ;#Qwvs(cG/݊*B|'  7PSáRXۥ#*_\)04{ví L*3`5ib>$f8K9`gFSE+wGP'YzS2+)%Z#XE ;U Cfډdl-R w~cBL!v0R_=؎,yE ♎xDF[Χ uQP@4V8͸ !R{;/UuU92Gm KC a0BRJx8% D:(Faάє8ɰdg3%= $oc?X< z iM?Z}c9wח5jCX>dŔ-͝7ZӍNZy9fYS ~ ڏt!$H6ыuaK1REqѿ7ccN3ŗ]rD\2%-f.fe*DiFTjƐ6\3+s;M|rm$ODPØ̰*3Xi-:jN82Rg6qI8z0x;˘ӄ>?=A=ZŴBV 0X9ש4/!Vqڼ[EB*VTv3X(Cbqj]j,6j6)PEj"~eP͗r`Gx; K?0(P2]he|15Ogu>ɜlΟ' Jt _O|L-Ҍ38~uO$(ͱXI4֖)`PN`8Qܨ4EBg.cϚ}Wxu)~,yW. 
LDqJ!M0b<5.SHksL$8c z%,ZPrs6cG/`MWNMB,ևF_A =| :a*a&%IE$2GXfY8mҩX07z`~CxJ `*,Ԧ L"KN`>e 8ndkS[X62oHƲ)eP-gg.I15)# 2J fZ5YR*S̴8XpZn$m#iI[=k [Pg|flxOnWEXk}O {mY?A=BT!㿰~nW*[Hvu^Vf9Z;[ N߱ˑGュxԖ{QI0X@e @%RV Aha#g{vVؑ ~G~ީVN[|`|0-WV6zW#tlLTEV[i}|Z|]vn_'{+[u^./U8Q1|>;==h=g~vVo{;:Ίi5J-z%ۿٛ<,#ÿj Vhmeۻ_c/~5%(>G]GVk\\_F@(BG&_se!=[dF>ގ^Jn\ !՝y_ţN-/>Vg/Z)~lߝMb2 Q>\gHW%#d=!>E\y M,~݆GAtn?Z{tqqU( m>BڂXݢKV9Su*W6W;^G^F2Wӧ*O[T؍>!:ϒ#DlKUo؜N=K?N;?|Yo?!3f8kuy ^@[yR\yA/]C mi1-5|և(%0=Pnu@}` 4g/.,pPB F#Jhl]vOa-d@o=5\lrdi-:5,tE00 pvovgţ_hC5ZXۗmv&G4_J2&w!f) &I91(|d&(#$KdJA-% )ʆHq\/S٦kqH(VT n;Hy݀`H[%n+Ϯ[xrP#vzOL̋Q'ڞq{ZM!iJk3kH[L*-4 <{w >fҙB3{7 .A(4S~!.63E!/hRxQ@}w&U qM rt=3\ 4@0l`j?\ڄ^M3\u*Q]݆OZy3 zT{K D0?vF @ghOK$LBc&T$F~*6&xOoD*Fb (Pʥ9ΕÉc.ak4V8BhP*4Iu ;dW̜wu?:' }|w'(w:|؄Lm8(Y.Qإf[ d0tzy`gb$-N~2'^wd?Ba%/֗ss&;#gO{&ÙCÔ:k!f< " ogay^>rFǬsr!Oܖ4?)~0~'רּ, joCm_Zo@I{AyǙ.0aF]PV.CR:Qu/"T5]h{1TY;1"-޳;|s6ZS&40X^}GsE؁'I}!\rB̟8 g&f*ձrqIqb N8%,~~Q7:A6د_1$E˹D2s3oeY}3#>Y^(YZpz+t`j+u}{m04 45E?{6vhoUvfvj~͹@@Dmdw鑂;{sT=|"nԺ|i,{ .?gJ_+'7,wɿݢWly-l5,f4d£ $uJe#q1/nj'6s| zq w{AXR˝ 2~zɼ)CK 9_jw^qY&du2#^^w-wِ:8Qȍ1fCϹ~֓:A&-T; PF>޵,}xޟsOyuX3dcK4*[>hsŮwwXzuVM}dyUtۆeͬA1|<\ mЫGVA)wA'dyybj鶳WӎJ ^Sjװyl`~Ͷ]8CM"crOʭK'猙Mye&<'pZ1}T׷,?qf؞ߴۺ<"wLj P >_XbRq̢+t *Dd@_E_*%N6O4WHxh! #5+uݎ/u;D+Qpl\и64>Qnql9!#Ur?k>Zenvyrp.E}arٻ}rAnyz#xYŞ\>D`v@]YZFY\itŊ2V/eϏG•jƹ_de ׀>O$ڝ3q@Q$~G[ֆiݹ3;d'$V ,ɖ(qYӼяE*ը >X%_9hp9d,%Mӂݽik~), .?-Q_vP2'w7tBBٯo܃mxO ٕҨ Hcer3] z:jZ>,|.2pqfs|tԟ)e !M8 ebpH cIq ?i/N'7`݃( "Ax7OHgeIW<]Q(4L^gw/s.!U941b bm~gfiKip"0I݅7Nc$q,Gr]zU S=nc&$LK [hɄqqF:G#NBu'LvݣŢ&7[o<*fer뉟?xi<"0BIL i©86Zd0-EZ.lC8؁a41.N$C`*L P&c#ͱRskA Jc&ҀlP(btlбoHb 2JP@W$ZR-GX5YLQQMбi OM뵆A):<ж!Ϟiw( RN)BR6a:z;q 9Bf{abB(.t iOH&)!aA|Iʝ0.!?ϾzCMMm P& qp(!E#"DO3$0p-JH*RjRC٧GqbƲnfԈo6$ q#A=vf46y 9 ;&M"S $!D! 
kSC.l3ެ ODCL1 XQH AD^%/dz9c;^U$+E6Q+2#!mpE|1\2Y48Rdӹ>BÂ:?O&PAj?ކ|_kJ@_E{%:==_]T/d{bntY_jHW^mj/dOT'S|ީx:nuw^g7^/ gUp>PE,Qƣ˜$m%Pr yzqvq6}N'{Ozj7w#Ic7Uay hl:lPSt$hd+6ZdS*E3`K3L7}3>A(kv eQQ2>ol=qӻߞ:{|o޽>9짳_P`>ny6xu?gʮm]uA6]rC;Iڑ^̭!ԼʹrVnT ͚<:5)57efA,_VFڗ?FLfB$`+XVn1`pQ#--7W1SĹF;Y %}eu7-h?KŬ(f&0-b1X?U˟'`sv3[H 85 E FH[IⰍ#bX48ʪ Nݡ#}3U#$O`F'9~D L?*;^$ấ5$ciP,B5qtmJ&#Q˰NereL{$hAIUd>]*y켨 xc^K 8CզC1tf2)P`g塬,crJ|z=,IlkI!% ch;cLʒ$-ؾj;TM!cHJKX(E$L:YYDH!s)tdžGv E%2wwXu3"]k.흇 ݌MZHl0hߑ//sxOV,06AmoNMy>N> .h9SFcͧrUAJ9M4VSxQLxhvR{5~#X(ʈːZE„`ra٘ttfJ%xLo8s8(~8wRYpn*_'M%@/G`j#L6EOZM6 e 1YĆ :EGq8#q,qf;bF%b[KOY$;Ssy]0pS dB3ꑘcH~;xz!d2Rc2v1"i,0=! +Up$O< ބWEj72HR-~H} (;~fL,d|V & 9q(!XaZUde%?OG_AN\dPFTގ^5]> $Y!Iq.eCpO č27BsElho^~Y ;ղm̨ؓ-Ξ=IO$4bCY87RXpI25ӚL+ljFU_Z +Վ;_E ҧ0Y bϲ:RFTȹGh#'K t:OG0^{`4b3w A@XߝWPtL48 ~079-~@ @Ξٌ{,YwVM_v)mfEH[ާ(D"gi6.)b^_|ڳo=\iVN5wQPy¾_borAϋ&Yf7Le3>L[HNgjǑ_qQWm`SλZf٭)JGqRkI|$v,ޏȲjZOZR]J/V'Vҷ,d YU@_գ+V<šH*hlCK&pqݑ^./{;pMDf!c"셎^r0 ۨ5ǵ:8]6y~" _`j¶Sojlʻ6Y]uN)<+ E9 K>@z T/e~w\0r#X ޓ QYW`UVo VVջ2J*ZFY(ش`_^E3PF|adUe!.I,:1uaS'uRnwabf]t:b.6uskEK]9ulu8U6s!uM{)mI ;e.]ZuzaToXwiRGC "/yFۗxӭ'Dr+/7vQGWʼO,gANQnٕǁT3m@vwW7i3\ jhQ:Uʵiԭ}һt*E%=L#>we#r!z|`wqq-MݮQܞaEZ(V 뉀V.K` 58ZgHW4Pg?S!~yH12Ǖ³Ju?iMG+󟥨Nug#-|GGj+ex{_s*ȆKT{2G<Ư (O-cl-)2~}%{4 殿NPa?'UCR̿vWzv &v0ڎh`w,D=O tXJ1։\$:2ղl6M>*S7̩|sl֗2swgV:l슙\c4rޙ:YH`]?*%tqIὝ7W_j(nQK7x^ 6ܰg7tEsGfYq7sD*msUѪ_;Bz4Uhehѱfihj1&ಓ,=3yL}yh>-olb ƶ iCwE~;>ic!LvbD+ |/C#J"~1/&L&t(|Pmrg \Ȟ-^CtO ObX0 /zʱ 3Z'bj @m9.I1t1->Y&D?yaFQ)@GsNIt?)s/2@=Z}7i/+-gA  7q?"X=$V%-Pk8 Iw>~uZZ> Ȱ (JknAg+w|}nJyfՒI,.B o1*(6-&VG`Z ]Uyz߈>^]tC!#Eiޱ g͑{Cw{nM njf_/_>n@$s-Oxznu{H_H~^0i&c%if){%. 
xYHϭ_ EBϧw&6hewe~Ȯ_lҺz"%}؏{@(fO;zKE1 GdjD@2s_>N!1uT%O: FiWJor>H.j0zC%;vq7<7Ho &Tc PyI(o7wN !v)DdqfWi59M;숉KPs4qMWdNYt9"i>&0?s _Cߗ 'Ls\m+zI)`dH3` Ҍ+>+ S4k4f;Mu&ۅM3Nä( S)Ժ‹ ˅DޘBp#HxR&”VSE"H ۳um@mSY88o&r-gԕmꦅF]v!Z+1_*XoQ]9m n'C^:[ԺګWWfTWĎnm֨+N+NP5QWoP] }]1qS??g;:ݧWr+)Sl9<9"Olmdbaۘ\"˰MۥE5h:!2j0 YVu,va?*"]qSD` a뻪HŞ8>CWO|':fnOtFF0tMCn3SͰlDZ=J -$/u MhuZ݄V7MhuZ݄V7MhuZ݄V75Vl&záծaӮ i݄lSiZCUh#w4YK-,zT]fY6,/MTאohif9Kڴkylkzv zmLV +qo"'ޟ g30kUcG4&AH3ZA-Ws|bb 1} xj_8 E3rCضfyrOڏ0kIx{\Lqu͵Oz.eV n!gD2Ce`l"۳|athB ܬ?ǝS|}ys}tr9: Y{o>woX +j8&wW,}λOW+T0׉ss\ssYJ[P?bFTZlقG7q`7_bKד021vj@ָkDs/R Jktkz/\Z 9l+б=\|DSy./|Dq-v<ݞèA91¾A}9IE&C/Oi("fdbȉr+|_Xϟݹع\^_v.~9r."BӲ]fXZXZ*ώ:>tb |'뻐u ;G*ߗ0(O'Z=jg$=Fw#5 :/vE3{. *$g1Z_Ժ7 tm(H㟿~x)))'`5=b]xɷ x8ȓZ~Ѐ9;n]kg$u eH&~Anɻ~12az[ojQf0׏gmxd1nHwm:#`M5LH#a$UMgj_+~ ݪMP4𱳟 +d}$w]xgMTnX_M! {ZJYHam/<߬ǩlV`mLXcr)|4mR˞D=}aׇm2 }9š{nOY$l t$3}My/7GM"k~!e{8aa6a7hM3u'?Uå}NnZ&R #qT*mLAS"Oh4OwU.vYҕKN cE \qS10FjcWs 9 }ç& ^AQNNw '|i[\pĻA HRܫVŤ1( _yo8{ r LZ޽F/vOm?D{F ?K Y,vŪ\XX%rX! z'2']@A40mg9S bA2&/d)&o{&M)G;n؈¬vO^Q/W89/-Ec\v~_L`DCy٣YV΁:((;iq؃5HKxAФ2=e<#bt'Yz@o=`@3q+>SkONK&M̠D {tYo4uS;1)X=XJ4OZM5t"~gRNר%`tð5\|-,  Ldf ƢF\1 tsݱƌYӌ.L9[s+YHsjBVqEj,>3*^42 fc:kY?LP-왔~ZAP{!3@u ƪij1LT6-~c w[&Ef}봤R7𮴗:w֛wh,ێa@&}yKCxƱpEXEXٰ&>a_}eU7й2Hx+N_5P49z,ݰ/n{7n"[|TK=|[w9;cI+ڵ7f8ER<>3v <ߌ݀!51CWcOA͡ȇMAfҸwX Т 1J11BV+㙍\PI/j2rfj2-fj0XO`[{Q2i pCH6ܣ>EI#s<ݧLg問Zzhkn-ucճo4vKd4g^Vm,q?TEDl~n2*E vVwVd. PDM>` 鉺]7&;u\{';46=lž>sd%Y&ͳĩ#6rODPf8'+2K 1j&ICWA9iw|=3]sokAyWV5@KMnZgrO)=2Zll11,-|uČbLq %-ى)rw䇛  C2,h]Cc96_VEӌxHFlggeN7H:F X$$V8E ͘ H`jL;r!YU5Ǥ/ 6`APkkc>_і㋯!(Ȑu:Uߑ^-Y%b"^L^&l Y$ `?6ۏ,Iڙz:X+/W$LQ4eY|f.ҳ!;K($SY0JS6\歄udyC;M)g0m$i*C̐%*3Xi-:b2egf&}lW:iijYْlf&wփ[3K3 ]X,pu ^U..x-΂vQ<{AUת*Xko[rpN7DT z}(/O)t_Tv󱐈Fg׳I^0@g7>/k 6G@z.؜E)ChץҤ)Cc\ [ťjbR epLqcQP ޞyPrh#KDQѼvh!z^g (zY~Y*"4^tٸ'YPM{Xb6OE8_J$KJ^]-s ߌX#mxT18O[AxZ}P>.p!JQ*qa-`ܸLa#QɭIQJ$B (]5fm}_ƣ|d@y$VGa>J&rRRqfaYN[h.kaA~CXʥV! 
0)e5DbSеpA*I4 [ Kce%I)T(AK9"S<#)6) bʉj58-N KN;hSlI\.DV \Vkk7 5os|㽦MOvvϸ y~;q 9bi8*I%r#vk ?p~;ܮnM.Q>Ӆdף=Dd+sjHiq[[Rz6 RFN2&JbtH՘e)RJ"DŏI4a3Zqbp(iBlj,fe ϐ3YbaO Umא.b%_t|p)dcPE6Ȥ _;ۇamN~qD듿|ճנ|]VT9RI$ep(EZ &Ia2#qCZN¥sI`%Z _~zMBOg ͑hS9giTs:Q30}+lJ0V|/?{f' 3.~dJY{?OR #@\N&()„Üx},M7. ۉԾ/ԇi)F I"ᙶ@25QaX)eA3ZH2Rɽ|Õ## lvgcMg?wSPGcSZ0ǿ5p#rfKqH^xfADvJI}yfȯy}0yѦ˥S1q鑄{G=FX(` eK5= /8{"M@YNg`^$JZȸNE[X>-SOEv2G(m:?N&~L6 V VMB~}2:Y#R8NXjC7#dj3tŹȞAr(-ZyuT6nY~.+n|w=?.]g-ť}4qy$[%Gyq,wl'oh֑(km>LlfX>#;2 VЯ^nhr"k'ݣ/6jro1j7l#~`Ls_.tiK!ǠS$@*;Mۑ]9?w?;_ޝ~çSۿ=;[N=z ??n#&1Ј 54WM3t:[W glWnw}XvGA; Z0!E?]|?ʵ}[:y&X#ԏ/C<ρڬ24!‹tXz0d贑*:fp*:S/ ~blsO6€dVXJLbt1AK{28^ fO^QGʸJ2 J _0\]Ra%eLDj0 Fݾ#| V=$W<"G&2 n20ML'4lB Vv>۪&V#IO ʘHb߃Γ>t+|:"]YOoZ{iՀC8Tc8HSr!H('ʉrh('ʉr2sI}\[e_xN֮i/gj܉Uj3*b)F)ۗE<>+bҔ)9jAB+1<#,Œb+H$Չ%`9:t>Of=7׶XѸz}w3wM,e;_??s:C)d4$P)lƴ2s"Y8ǘH6Ax4 Q8j6]NR 7ZuF#,,Iw` unL~u1*+۫{.CcCI}~&m"V™ Ϟў6ht\<BțC)ijg١vvjg١vvjg8vv?CP;;CP;;CP;;CP;;uvvjg١vv}P,UV`G!+Sދ>j=?j ζԏeJs`bfiB`f~ $[ i2{ .ѫ 71BY DFzCw6<3MR6G92/r "h׽Nr5F SMoF~R=៞'@['Wj:F X$$V8E ͘ @m3f_C!X_{?}5kE`3\Ips gj{ȑ_`pw/; &D4jى~UK$ԴA<&V誰UPsž|OU=}Uoj}&~g}f-w㵡>j`e*iwNכxGlzfY̾f7Lr 웷 FӶ$<τ ?_fxӕ`Bds5내+يVV,v#+c0DVfouMfCH2_l0`([N`0c8X@$r3WɅMP4ArYƛ`p))\[<)B4S,5aZiT^浆cAKufͪb=a#VD_G60)\͇dmaDT/ £ p%^2Ut^2ԜzgX/+a$B\BWVv;2ҕPJ1] 𳕎.;ܖ JNNWrd+I t;p9WGߖxܾ5_{ah/˓cD5ѡ^u?,PK<cޯV%䍊Q5a*r撚h"Zm2:QXi=ViΙ ,BBWVvee%]i) mD7k"\ ]!^Vΐ! xvVx=]!扮Ά_0& j r ]!Z!NWR{]=^^ 9=]55UU#t̺ J&zS o#+ldjg5B]`C+kE4tŬt%_&:]`.H4tp%Tw8/ΐ`9PB1L2{?:Th|;O8^|MOGԘjtBaun"/*K1*T«w DR%^=C^CJ_2)PBBdxPFѓZ5_'/#K'a8 \aגy,Lp3 c2&YI|I^/ϻ}쾜TLB"wEb"C/I(E8GPMELtYDEhEDJHWl4Ӿ+ނup㱮ʵ4C)RV9ҕLR]!`up]+@Hx;tzuA'2֧NiלS3.ִTDwTR^:ځڝ^uB>3 ]+DiM3+ƈ&:"BBFCWWX uk'2]8DWǺLc+@Km+@ɹHtut%DDWX2 ]\͢q~J0CL]#]aJ_l۔E ėUD-VTE"\&cYm ]_mLs\mgF [Or ]!ZyQs+%4"CWGc]!Z%NW22Β H]`Ay4tpy4iT,ҕZ+l]\sԴWF$z3tz}AG^>fppDћt҉^:T(HDt]!SWFoVɮ4<(]!`k+K-]+D`':⒲+|fpy4 t(uΑLш sO ~fh9:]!J G`&g2*ɈJk;} F#U4Il4#4S)$D&ZIh2gh*iU:69բt(m`ϑbȽj"BRBW֐:ҕQ˘+,.5=zBB$:CWj]Ye' ]ZJ;(y]2{N \@fs[U'5';9dЕItҩ ""+,x4tpOm]5B+O3 %ΐVDWL+ft4tpJoR) B0#6BZNWRDWgHWcDۣ+,"]\mt%Α$1,\̞13+B:u. 
ID4Zj p9ЮR V3\m]`ie4tpQo*]Z+lJY J[ͩ*E5ZIDWG+#Y8L0t?Ѹ?7+{;7jo+o!~5_|d`T.vbtkOB9t4++C`ƷsY_OՍ{dt/f[\yT]@d 0~$dELk+g'Fqa `!/܏/aRW|%D6r;\+ -YBF]?,d j5Zq; )qs;Ǘe{3)=ݴ^tݿ?}35uпÿ?C [H3"KN2VJOĘz~?@PM`.B;k[ wF>Pj-O!tokRw7l>o\B0է|x;x^oxd܆>|l䡪?kko)5S^f,AXTmj<^~-UicYXըڈ\ǭ.oˏBgwd U׎I^BPdP >Z.(}R!e%SWrget}j0|gH=ġ[a^8gL{aɬ0qa^=|XZջvw3UcA_g! r'BL{%i!K`0^[.*)93 mTŤ?._Au9ڔv0.IV9mY?iYᲤ7Io:7T{CI򬓆nѫ+VR#+C`(Y^_W.ρjm6.jǡN=>uӽ,)qR|v #Z@  a|n?)޺İBƓ\64O][[mH\vmrj&&@\~$Ee]SieFmX'z5Z.w/<)f*8Ny[ =<|ldtֲҙ=w^N4w'?2s1yirJ!ߜB.PN);¼>{ܪ.ALX1RN"e({)L7;m^]<ם8Ӫ;KucSc #i5Kl1 d$xv2]!>K`yRpz3^I+> Ϊ:N1۪^ O nq<-5%E)em~X,D.L+IJ"d*C&J/3k )ﵠRzVNeӸsc̱|:>6YGH(򹖜N)bxЄmOlOu]x#0E[غrK(ֿL$໿φÍ/K_͇bIgwnǖ׾8!W ϕ ը2*zGKںmg gj~-X:!rFA `hn)1H4ӒsLń@-|5L9+#pNWjmSà Lȧ]Nl\Yݟ+t19ʊ WzOZ&}G!kdGbm`Lb\5 F^ F@~I9~r}UwL kZ.<غL Q@8U)T8 WwǥwݏsG ..\HƛтI^&ܻz5t#m~Tkp XiT"@P'9ͅr^ٻRʗ֟BUw6|g:'l<fƙQ,uٍN+ O2r4/@M> q{ǘ!}g] C]]K,[Ƣc\8maߋ78Ŝ9s˭#:F 6"|ucNLI(wp)BSϘ8P=x:Fpn#iZ+I\=]F-ƚu=K~c})%t?_Xsz;hqĈ"~Ɨb Mj#JhLp*sfΕ$ݦu8M{4Sr3Nw %g0BYxKn$% IUxo b#ZN5>4N@!D!e!x0k5f,`ZF 96MVHKDkGAn?vSk IM ר]QQq!,$S{+%R :nTcݑ]s"=:}w"7l|֦(śؓŠWJ;VEJ{)B}]_`cDGm7 ,3pWk=q i-:!Mgi!աYTݗ_;ymz]{_hdaT'ZK(Ir yI St$LEt!"'U2`@e!\H☁E 3  D9.#ȁiǕݜ:xL+ : H[XA`IrY$53Rెю:N8.99bKٲmn!zUK=;ee._g T8Fi$Gs&.7AKf,GsMQ߻{c޷#7 B]YNd;Mx!1bHq/GYlJ*FFw,rZEAqSb "7i6"ap&LI@ViǬQFYJy>%(ckgklgxvR/] [T\9q TkѲ Li[:lh`#ƻXYvò]Hp( )Od QIh{BIE_ulB**HwD@4:&(fEwZI% 0B'wlGM T/ Si^11+{hf݉%B, *W<+FS12I!Ee"JQIAr9tߗjڅ s(C[eFj8 BLH9P=$FCVt.#5|5\;]E8;]UFZ]?s㒾whpfg/Z$KpiG4LEq0&< I|)=arn\8LR)"/{{dp5A>d% It|peg]{yMqlz6r8 x,KoӳT/qQ6u6-˽<3ra!4#u# !(8*Fa8*d`0a1iѰY!z7Mҍt$FmNR T<@>]zl+fi>٨\cR5/@ʗπ/O^|u?'?4m UCSvZ99wW9qKJ/ڞ[ւ(:b_&.]΋IbBKYz3_}68z:rV@ Z@! lM‹w18c\#=TmM-S(ƌ:K01 a?i }%`?+à z#(wRO0/O:=ح\ZX0N{K]q8HFkn82 Sw߻#| VwH k`9IG%~V89^j)Vp=0ӹG|\qM|ց%a!?k!)Kn"'`B [cI"!tW`,lᖐkc&PcLBQ܂9udXD -J)#\"V$Jn6S#ޮn>|=uE3 P"_!NCgtB;bwŦۼserOmTҀbR9c6([!Sլ̾zOJO<0ͿxMZ9R3>隉dg}S# i÷ۛWg+vζb=u7X'UeBL!+ $%a@B  QA4Y*$l9a,;JZddl`Οi2p3 (). 0'bq? 
>޾J+\!V2s|l{s|Gdٯn6ԟG)[wC} ;Z"3 nX/_yxHtI L GKP h^A0pMf:ۄ\VZ`N^o#Bn͝R919 \J9qޜ8AɻO91A%;ͦ~y6Xlbގ,Zgi8du*A0ܓ%ѻV%mX냑.#P$m.iw)JZL)%(e\T/(UWOA\XzJުOJ9EOo4=3]4cD/^%PB1BնG6ci3E6V `rc'x5N)"rNS[RF@? Μכ^6NvϽ恕tc়5 hMqg&e<+?-<{CT2#H4!q^7j)2<)\İZB_y+52E^[8JtKc'S)<1yam?+lB6`uFǨٷ# ,YydA kN4^J ߂b"XZYl 4BRh6B UTknJ)>IDK1r%TH2O"2Ԃ%ŃdL@(j < E܍;6̰ƌ ֑-m3 JGgABz)ҵ=@BiFn0b/)c/!l=0wbGIm6QF(P@T ap.SJY`85>F%Q&Yp` R!&P a-*A,a f3}8L26X܂peډ@w*"E:&MB.7A(ƙ% JVEB[CvJ=@؆u}NVmC1]EleF͝Qx:(~J Χ$:dZr"5o6 wmmi~q/a`Yl0I^v7D&rlS͋(MR"GT껜S]Qb)IX1TUf J} Dƌ6Z/b&Acj #Y{5.ՑC#(.Xl8\ !cV+d1U60<۪l( :^-5 X q9YBy&nDUPd _&܆_dp/C%OU8FlU&IqXbYGcSqL[q}ָr#slù -< ;T%d* R2")zg4bD;qΒwTB\1`!MM(M0 qh1;@:9.g !./5GȳBmTzr%Ōq2M!̃V;@C>cwX,50L|DbP*c*ʦ[ciT66% ac52@V$}@8FVJ (*ጚ1 J8 92jܠ`5pFY,<|\*Z!uϮLD)ԥWGE]% IK\Q$"/=kjEQ_U0Vr!BI1UV  ѩ-C^tpVc+:c"& LgtYXv3~2mŬ*$c HQIbYj9rGy}fK{tf@ dnIKDBKJ7AyA3 vXmAԨƂBG1 P$bDM W$D胕68Lt~Xe۶Q@[) /. D,/x7VTlh >g,XN~T} yrx*NEɜ/+II|'Ӈ!pݼ[ŝCLm]LՐZE}m6F82=}\ZQyH! .p /룫#D ^X RX|*#m60Ze 5_ 9@g@!t9hs^u@h/kD#+N`:(@K"mQ82Ggm]acVwΨI`&Ybm4P2~2w1`D)a !ܨ0̹+ 1xXe(! c=H>K4К5'7tc,o+,4LkTRp0U U_*}jo:Zz7*-Ia. f|t*2 c%V`&k1ҘW7v3擮|Z1m{W\ӘI,ԵW@vZ #I`d*mtnLo+MA߁Qf1k]0!58EM*5kk8n^o* 9QpyВ>? Vr)p A(P`AH&P#F' \ c52[66I#+dO U.R|-_OhHO'+jw2f=5kѿV5L@R$lv2骙kũfǞ``+LW9nbr]~aSpx=c,?fS8* gYiSI]KRt{xnquyQ<徔/~_}Yt2;n +־c\2?OSK!h%~Z!L` Oo|*dV.9'@}ەIvZ$?$ ̶-WI)c7vMIjNB)\$ ! 
Ɲ $lr{*Y+űCf _!$4U3XIW'֬5U*JW0]&M8mb>t5'֬G+טdBJW0X9&+Xk?΄11]y):%2%t֮``Gv+5]tZѦ^(~˿?[99 avZoR~?Ի7ޔ"m-:tʣe"vi[DyN?gMʿEv^X,&} 6S .:qb Bh|i-OzaM-:$FUM,J0$5ڛ Oި2zFGD񷕶 0l]\ml:nοgO1?y|?Oa|)w\`OflN^_\| ⿇z%=]`?&i_VDݭ u@ԛALW;BoVnWM/wI[j[G 7WG/cSnl^ƻ>/ փ݊۝/n&P:F;Sc<|\e[=@< 64N}gecX΂e3?xߦfqvqg>1,7|6_|߭q/u fnE׎n`[@[[XjHBJM#I:ˆvŭYr"1NyLW˫SbU]Vn?ޜ]ܸ0se[ArwV}Dϡ|M?~_."b,S.;rs&-p7]; uâG0,/O=?jB>m?퇨%]p)wzqlo xw?5v&3"j6[wsd=~ѓ=Hn8$.w$Ot1 JܩhSdhClsHWe>*n-dm|9 4tq2 WS]~ޏ38)lݽ޾~@1[5c>^^Uû].>1jCv{MoeѬ.UYp׭{Smpo1RZZwpNti\L;D-M;_m :策1W!> Ut174Ogu{CuӇdzwhn-0#q2{^_0".}{>N-~]h|R#rpp]VҶg NGmLvIYD_!-4>\T002xg>|+n|󂕳cH+?MO=x>ƀe77G7ܪ*MYRYgk%)ndbȵ*9)([ |j|&m<^n[wվ3.T*͈1 Ǣȭkh"CQsz|;k .3ow9pkiz?aĚMw`n<~2m!߽[8q9v?qщ3bϵ1ĥߩ=ҾgD\?$-^?O9A02_[Q`yڍ^:j:g1ꤪ V#LHK^j*2@Q031IFp!Xb'*Rޢl=ٳFãsiG7Q[ %ԡg__4֗;dW9]mcۿBK[g_ 7+.jBd$ ݢ-GCq}Μs{! ef wm<iN$@Ra,HG%Hb$J҃EX`:<& L@p!wq 53F9OW4TD2'T|3Jb,`tM6U¿٬epSIs C=.A~,qgzGrny~X3׻roߤ璘i={@1tA[v-1N~|jLw557*b |y.0MzNCs3Z1g// Wy}2swa, ͵fM:u3? l~YDٴT^+Pi5]J(!h ϒ~ТqlM/01S ƌv&a8Ƿ/nY2#$ eV'GNRW =`N{\RB[#W܁)OY M,o޺w`dW4AEw]M%1{޽6[Q<jޘ lӡ֦CQN<jrڭ - ;~n0 A`!ؔ;I8N_[Sc\%\ܚ,+Aio.CۻAKB;͵J*)|al]A&x, ^U:j Y.:I \r~B҈o)L%`F&Eܺ|o-`5ghqLe`%9M|]GٯǕfMF~{t鋚fW0|f4e-`wY}z\ggd٪3oɀkpf^ z)&;s[z ͩ{^rZ`N?1wqRK9xj$g\Kr/Yfx{RhMJ0Wm{'* 3d̯AS,dl < P逰109HysQ 9cO?7@K r2.]}.<6dKe/Bd[ CtlLr>#4Q|,g(.|dҦT6۔?AA-8vjp֨\&u[<xpvjivN`H*gRbr[ܷΎ,7fL;Lykb%` ?rkoU yq?"YAhLp*sfΕ$z5%DbzgnmA0581_]g YigS; $% r*z͌rE[ɶ%n{XFlqt x$ lzyg՘IDk1ĔPo4BZ"vUvu_p$Tۭb)_!Ikԩ4;7 qf%*Ű[0Δ7sŦ~upXhѨJQw6TTu(90.3)sJF(E}R>RLqK%bdF/c]!n'l衖ыsEq4N1z}>.ݨ_6#!7sܟ F&Kx$N0+D{0EJGt`ZD"rRBh7a iU3AbVA3 ('@׆TK #ȁ1ќZxLW$5pFBb%#11)DXҠs0I̥Ԇ]XC|崖ZN{HNk{==8BpYJ?Ӷ"Q#N[-~%8l9fvz!VE̴9?bJc1F1sLTv[{sZod>}rڬM&gv?#7 { }}82ϗh{tMYњ}>-OP y_&;i0 R3>鈧5MY&٧:ݺZ[7ד][jq=[l[Ch!RJ>|w5JP6|_X@^r=D) 5(w+LVd͚]d 9&r6ȹM Ȣ`;XqX+?B"22֙FjۭK7kwoE+/.m`\͇֦[oQF{!M.ٛ Z{Pm! 
WW `F\}W ZJP ڊ((cb?qK$F\%pq%kKW%{$Zp"ߺJPrي(8RxNQڳ)wQu#3lP]B|aon,Li ]c0 fgu=?f5*W.R7JF,a *QlA$/D|o`d B .5QFbPé.'IbdG Zhxb 9" `SLFHqiٷNP /FhO[z{#eeҼHq%STG*G\%p5q>n*AIH+^gR}r{$zo%zJs g0zxzׅ{?\eͿ~8.۬WೣZPbx6##b⃕!Q‘HZ)RA2 C \SgDJidB:fO+#ֱjl/P (amP r65wT`(@TGXH*sWҿ: :9P]a+͋ \L ʅd$IvndV;)E~: _-i݆Aݩeho.xEݵMT*0(SZ{1Iju@/b=OלhLunYBUjAV<j`rz++Ƭvk].m_vkڎ mJvk@e.mcmZNk].mg^Sk!&[wxHtΣ9&(sɌEh\{#𶻠O{+hNFy܇r_7 P1?H9 )L)  422q"[ Lq0 ˖ږf#QmB O1jveZ2^jFl|bp6[uTݻbu ˥o[64} 5CSź -|r˸هN ځ[֊(#`R<ݛؕLh5*$Е l~]uLYEwTi]J(6".N6Vm6:x N ۩:M°9ln;i`W_6h[HI`>0I|K]u9HFkn82 ٨{lH#ˊ? n4y$]}؃0ZdxXY=4r@y!T/P09r;Fk-rr e|w$E#摍)F'ZEmL:Gm_R8/FJá5\`nVC9J c̱r6E{%@R2|%gSԄl (eCC)&Xl TCac:e^ wE%栳C@ŚTnpͷUXƷC_l(cJlF %eL *0pQN;9m{hqԂ.OhL|2vL8O%RFXp4= ۺ=`wNU'y] y\e[^4|Fbۑ' !&%W8ӮrǙ7x6J!^ڠLiR <"Ŗyꨙ BF{_n*x(а{쇫 Sq2pr1o:>mW]o"þOUTWF0O\aJWFK_}|5f_Ź^! @H פ`JQI)%3ZcLM8[37fW6miٴݾrQg½lݶ,miia_9=D) KJY3z6>}|aM3y@r 8M(rnLGEdQ0A[R.k%OF̑@fs=]o][N>_e ;;>[~;|Ș)qΏ/Ji)ե WȸPFCɢ{0D1),#i;au }*$DϰZ/oj=voLt0?3;enQ Ú-bK0_6 v~3p6ܛp=7}NbELֆX2eiQ%sF12SNq#Tq|9ƕ7/<>%X`T9F$㑅H>HԔ1тFQ4`pHn,uFl8}y3|V+?C/nr1WuịUh@ 62( F刊`) NP2}Va9GMYlSxlwnuv#ۍ@,%CxZ>h$t-0|ɬt$rHlJr63d&#Q(g! wZ D佖豉B i`9Cˮg*3tKѠ$nhIe͞iIC7MgL-T\zw=U}ڕ.K:֙ȝM]{7fF{RȀ]Lp=Zкj]֭ww9tf2ݫYu-KhYwknw-~w>Vء畖Ck,eσlVly7ꞎfFeZlLiڵ[KR-e4ԉ~PPWlo7)Jn?ViZB+>b=`c-=2Kn^9fjv`T]7],z݌@.zUXyYH7,$+7}*a< Zٕ/Fb~4a9t=EgϤx bw!] l0v']4vcwI՞J*m ;lm\\#~~0+,M曳䛓U=y$dA;/<-sW]KFZbʚ8T2_>_V;šsòW/2S<2S,C'uԲj>NRzJ6~DmqwYʝpgIwXPfń!c~2_ŠNOLe=2{ ;L Z"h$Cךx!`r+zzAtM;Bedu(Z-\ܭx\gtۓ7'w9-tOУ?{F{ڠq;h|19|{9#ЫG /wuXgWԭETzSMtzɑ). S-'g=R!+v@Ch9,xl*/ pAb-|X[#d9!H1\*%JɥR%KR)TJ.KR)ٯK

(v q,.?*ҕ")9IE$s2J`89$!H&^< ʏՆJ$km H.jO)dp^}j0c?sqI߻@IƊ.`kW w/yD.ìXÄ'$I8wQYޥF7ƅJ>iϋ.IuV|ESɣ^l=M(vP0E]\JA)L.3ao%P0 vVt+P& -,*r1bAvt60%1/Vc/L1 $C^ wō|`e_7$ѫWKS +ujC6o0$]q*$L.TsXӼuQ7/>!Z~|_=vv5*8O xe[\OQ.굥z+"jo ]25Cm$[Gb|HmÐamfu h\(XIGûf1׃ٮ):*AGm&6j\%uB#i`ĥo b/.=9rvt*`Ol0npsO׷߾o_~~_߼A V`(V$r 7?vӾ񚡩b]}z]}NerW-tk@^xf0u)dMLt&Ԛ1t#_+@6u]6b0-TyUmPܧՂ*w+C‹:@Z1vHha}  )&)Eɿeom$A#57@F|>G4,y$Uj؃H0ZdXY44rcN/;mپv6'֜ xgʽy߃1;g]8Un.ژt.ڂr#L?yf0Ms\9S &c0q֖}H(򥖜N)bxЄٷWuz_tYG>7bey<[che~AH9HysQ 9l-*+zmbMc*iE*:n|;nC14P*K꣙>T`(ֿwsMwѻ]$ڷG˻* qx~ Ղl:x*͈g;!kr2ijf/~Z~|~o{@:!rFA U`hi)1HWg9(@sķӬ+B.F%D@ҪH :Dĕ%2ze"A!ń5]t9~91#Gs}!7m͇l|֦ؗ>h`B:fb#I_vq>h,f=XX0 D7EIJFVOUI HVW%yEddМe8aJL []$A e,%ZIW¼z ;lM/=ϫ4CVP!vfI@f~>AEt[ n/CXXV:3<@Ă)1Z:/%u3b˸s6#ahe[^ 4|6#1A}/6dM:|l[dm6 &U!:J ӧlcA8'ql(`HW4EV ?<=?iAʌ~krf9kсfUi&|7紽|V ]}ͻGE&妟9ʀ3[sM>@X#ɾs=Дc$yuK0I1H@!|ڦ\ϸpasp:޼:0ځ`t3ui`4, ֑ܲX{7Lad.q`(ZXA =+5rQvcMmN'狛ژ9 %& 뙱1$&Iē\x=qh})J*:=6:o]m]]4cx;Bep}uvqZc5#y6Ç442)s ˅$Gât0ME ͷ bMgg!7#)6<8$Y"(WARbG/nwdD'㼣o3Bx.4˾HU')IBVn02$R^Z抡w{v0ηo%zvk3ZB_ۛ&rdz}L7JtO+MWؿRS4ܼ7ZUC"a@roӢrxJ#L4T|ʴY6f?qNyC@1rݜY7ak7*`w4Q;E ^7leMn%Q WkewƂ1at/ Yp:9m2m%a%)ICbJ5R31׬7wVWo%|YAA} c@yoפe\sda6w\wltvZnmScs^jWK$(Ґlb]zb3yLߊ|{?5RP!Bx1E[,w}w]ZKkv6R`sG_Kɣ p+ɲ9bofx MBQwyU6Q̽(byQFI>e!ϴKeƻԤ$/ͻT[paauw,|={Yth}{>h!]" ۛff?<ρ=άGOh0x`? 
ӫzk&_rnSͧ~uEU㌨)g}VhX&;llz.Y>8Sd%YM7'7DĤp !GrM 1A)FRX \rD,ء.~3G-i~G#`rNEE rVK:n֊(<YMG˒OJMjG"qomNof|V˯{B>WǷw<~=WUR-ll^0b:HR< '%Hq7"KhiSL2c$CPh҆ .3<&B+LҾ=2PƄp©Iē\x=dʂuP%fݝ4?ɈnKes?1=E غ k:z* B^P:x6#Ds`d nJZPd+Nϲc""q2 񘻠 6KsQq ygLXuJ-ˈӉaMLXhGU &=@Ŕ( Փۃy8}_ ǣ.cVDGAgqs἖q|݉EvxMǯ-Ƽ,q ̲ݛ=Be3q<]x)r-,A9 JTP9p;Cek |ǭN6FGτV()5DL!O(D fV,ΏD?r-Žr}>z 3w})бf=Zy*jٖ] A#A=I4clC -HS 6MFi(bUB$uЮ+oOx/1H8^.Q[䞃^"^|8/deRj~qx5&]֙w'T.JDPl#5E $L09N־ :Uc] WϪQ"TRBrg:$F(1X"^ q4tJܪ$"8+Է/Eu9,_6,>`RedA `%#Hs ;Ƀ&k(7|L+݆ X=;9iaIHeӂp@ĠL YTL!b/iKеD@tTֶa9uv삭'iDYlhb*FF-BX:%b8@EVOE!ap&LI8FҎY | \KXGg-j3>?Q3Q 9%=ZcJijzmE%%ҥauHl!ڙϙ%}$^ߗ"[撟/yD$~*JH8YާF ;·W3*^=vTw S֓G{5NuV` $"aRL݅í?\' o/B_{տJ`pVt+P&K.*'3,f{W8h$x;_l>!3 wէi$^1-v֗5jCV{4Nahh֊S>z7W amLY<cWy׋cb lqk>߻).8Oo`60!T7#1~ۺaH0,o> LVPMx>f7pJ׎JQY%:N Hc8r1ȯ%%ٰ܆x$ǠSІNn/o~L|ϟ?yz_{ 30 .%?=  ׿6Ӷ񪡩bMm>t›|qÇ9m@~p:vҦ|,Ąd>WlmY7Ϊ3mUX ЧՌ*w @x"P-h#=');G_D]Kt ̢_IyW6ĚE:CJ|m4,y$W؃Hp2fSz`֧Ê*@-4\q1eN+;0{&6lh' pюm'7}ΉW*3WW4?g ueVθ>-.axP",ۛUy~/{waa^(giA@"JQPͭ$G~[eSUigmcOWojI坦USZeFv6n>v#u}| }svIFK+S^}#)+&4^ȹEv9ד6Six`,(3xn23S\dzWYny鏍=mw'ܜKc=hXDGX[gN qmyB`|Y[?\KKZpw2.QĐyC^2̹'u=$SLr^>OsA+x ^P?5w&G8Oӡݠ}=\瀎Xܭqj%9C`Ugj9HT/iBni˧{3V{k`ܵa+f*6Ĝ,s͙6J/;vvе@*q9Ȉ/6?i9N rH# be}0z)#"b0+CʘtTwzsk ˻Q9m1*UbkXTr>73v||pzU=]ҷt=]w^9IPLR6wz}@u1guOa:-lسgZ2rkn7M獷w>~kBL5gn [!_N{xn7fUӦe w[9'Gy/ ;d3Idφ xwdŗϬ|gTc"y-+Ǵ1B~ I˶yQaw9s0UѸ~^_'asU@9{|ɍl~Ua>y6IvpΔ bw&]bjebf}D2pH]kK|Oo :t9I`"{&CDTЂ)RRJnbi1g;JaظV[*@;~?As7 ܐ hAzl:|y_X%sQ[5O:`05UJEù+;L ܱu 8@EV ap&LI8FҎY | \K0NGg-0hH3>?s3 J@rXZaǠ*Kf6TOk^RYWi"i<Z\4$!ID a4u\z=a!թ| @%x.?N!B+*|Ne+yq1&Y?dC;S~9] $+<.3{B$y \ږH0Ɇ _<($ɾS5}j03.|{|0cKupuE/yj=yي1bޕ6#"6~LucƲr͟`*uZ)ԑ%>VRd` +0/0E]HA)L&&SPWa%P0 8+: C(XR8I r,H}(IÛpXX_G3a\SZ%XU L`:]r!o08||6%?,T9L[Sպ7k"g/Vsb+lqoqп-˵]q {L|x ;Bu# ^;Gh0чQiN> =YٿOwMNQ =|uUVNrI 9_Gc~W7EM1#S<Ǭp~~/??&û}/? 48jIo&o#@gCo͇ƴaT&C6g]-xquSpҠZ ~{ԟn~~6S|kΎ6C Pқ-U }Z-rT! 
0@_xqNF6Vm6:/VEn( rib|`Yk` )F4XsÑqM0(JG8u^Qs:)֝%{P F+B5+<0j|FxLYة+G֦blh'֜ dg`³ Wmx+6-ʵN9[pUNCԫ2ڛTd^5f`e@E`TbH$CD\TOEHC0δy:4'&e蜒h"* 弼!ec -_;T„tj xP:z Ȯ/P (a,o޾ٝj6eLj w5flپ9}`M*:eSLUE -c1'nZ0tx ,U,f`rHx*B\l|>Z)ulHe*8ʝ7\ D8P=d:FDz4‚Sg?ȵ xpYk9Lql0֗ϫkLϋɮm53;cy07OʖYLH \UhT V0/i :e%G) K配QS'4b5X)<ID0EbEOLN779pWgȵZWOm&Eu1`c>`UXщX wM_&$uD`@ H>)`byfq~3[8N`M`5l+7`8;]fu"C`ug 'r)JaDOT\ c ?siCfj< 6,B/a@K YEx*_&tkc&]0(Pʍ"6ytdXDcZ)"V?"D4lؽѰdrݾ޼vvn1qc囵v-&ѷ_nF{vmظDKTlD東q\r6nsq+xq/eQ%KɩΨ++VJq T:.Q]qբ`M:\]QWZz*Q)PVWW8fH71CoG_3+Q_B,⒎N0y~~61Ŗ( (ƬSaIbmMWܘyeWQ'e?^s @ uhMm&rugV}MTw m ":`pgU"P-D-箮+)qY]]Rb;`hgU"wF]%jGJ/Օ\uVAW%QW@>t*QEVW$SYxvS,O:=bηgm Q6gO~2SO 0PX:C$xν ĠcHn"̼!b@- R( xL.1xL.m{ӆ^!xLFFxL.1xL.sHi;I>BU'NޏCZ?E%Ag{h?IcExz_99WX{919/Y9+@%Pv_Ċ* `¢` 漍cDe4#TJ|1їg,K*JBXYL^jʈhA #(H8H7:ø-WTypkXO.v0YJf?L?~l _ך,9y c&PfTetۨQ1,%! j@ڢC Yڪ̲D5h45Yg9-.w<ʿ[2z_-X'q f+l {LDJlEf+"-j&>g+me&#QHYu2LwZ D佖豉x ih"-sƮ t0!K7?4d=<}SڄɜgAM&p_$b}ܠ-yףi'ZV݅ix0 t:z&W@s1)h]@ru1k{XNmvEݽeвEŭͻݵz^6~*mA+-wC2%?Yι σnyij+;i:`:ƬkΧ?okשecz|Ў'`&bfK4>FpE^Rwd/|gT( cژl_O?붦70C3kseis@`v'QNƣOӸA1$e*EهKUi| I1WAȋۣ:3e}]hYZ5Έ|Ѽnp/YekakӮAC~~s]&Y ,Rg2DOȘ"k/%%D )XZYh6 i) z?Rz˟&m7&uRz`Ǥj3SrY/%ΒE2`f<`nXYBkĂBt exS K7ݽvVQ{mm}y9-m$U]*$ksr.^upI'4_+otc5B*z3<9^3Iwǁyj7boi*l։6IF_E_ؿ޽49(CQq88i7zryD?}wts krP-g/hJ7`nNsOWR၎?j:S|Ʊt-BқBn EbaYFҶ,(=]yz:c麥KNEKw7dJ3iZhFZõ!rB,3cxP) :=V)n[EĖP&CJf/hliJS3ϽaVn]>mr_br :~uï_;tX:@?h2ݠu,+ALƫ«Zqҡv*@ W@>DUJ*Ux fH2~boϘ l}&mi1j f.llp9 nbXxSo{Vx{ey$=Hd!PyOTFg\tồI{;oSrv@,M ר]QQ8ҒPNbXXH6vSUyl";uU^>P2zEP.G 1X"Nq飶ěU`+ĵ">kE1krHfVr M+W$2$?yښ 87`Jk5&$5:b=7ٻ6r$Ugw6C. 
0lfv I-E-'vWn=nYR "dUWUEJDc RTbu?ftF(mMPéԙj|C`ʉ|$*Kdgѥ.VW\C/WH>)=w,pbo{}WZ$~dw : V0b<y~lMonrJ7JQ'lq~OG-[?80`Ůo _!s|AN9Bk:VqW߽y/|GL7_ gp05L¿F~7?m?4m Mm3hso3ns >g^lzDۻ׹{+ptfėNSJu]&_Ao^LrJDHVSWq@/< V6m6"}6k^D,ȥEKsI fQ\VftYHb G&mAKKz^#H [kXIW؁J0ZhXYwZhcʺ;i*G 66lh' hKW\7wM]< wDgImXRwO:|K}I^^@AWpe9(VmTtZqV/9J;Hu HRT r'3 :;UHyCSKIFD殃R-AGEo;P}z b⼕,CG#iUHy2 kV=O{lL; rttW@Ds%\c_%u6)_K=}lPK]R@$y8)uꤎ%&ZÈ7zRwlφkP{W-ڝj^0a~ϮFxn_^Wae\\WݴT:JMt@,, Pږu{ OGA~Cb~ - 8>>YL\i&X ,Hk6YNe&o <Teɔw`0OCJeea +EPƝ͐9A aށ&+`و*m@%|T_z=ȟ;#ph]?^orV)/9:~_OFh3v8ЪփESUu}XRZT )d2|ʂJsa YUUɑCUrtPX={4"5E'<7 ۃ"Xt`"+ɮ@-1Ȍϝ7<~l^3ZC,"SMYl#*;`RK|[v -׈xy`+1qUh[ܴ d.5㫺6{/O/.Әр(0Zl&dY'Q4چsXb3 0 ϷpC:cYp>7t]w=>4͕K1sȵU"uDе9Zg@Bē3r'uZn1w|`qm]z5n#}\M&O7|σn YwNy?72qkN?i5a^oms"m hPiف KI$ۿ`b妜4<ɶ3M*eRbR?+Zo4~ہ(j)19mZL#`TbHx$ )}@\Y"S&!\L\̴8;TJ̈́>۫0ɪ*%xìs#\m:᥁ 1-* 9CQ@3 fp2hBi[1c&;mExk-.>lJS|+ F f%dmgʾтй K3c6ոĩIT|w<4w97zxw)ޓJ$>yȓ5H5}͘HջsN),ZcuJ3KE҆16ԭ-im$fAcK7Ϸ*sV å_IrT[EϺ{kn޴z+׎o-T_~x3(Mq^kqFDPLqSYHYtj9} +#Vzy[nW*Xi͡8@)/ZM!hRZ !`up!,8ZIa/覗7Ŋe򻡗? |Z_|2־+ I.rd &Z{w$߆E.u|wIW{k~?终OD+%;͉:-&} /6Ipc0]YĒ d+>NYzp*\ ;?8;_ȁ|!+m\-L XHwZvc|R8tʬ?{WƑ$2R߇cg{Aî!)ѦHGly-113Uuu!J k]XpW'wZ4@lfsY9Ί't#-\L銷,] -1]-BWg~1i׏~ +TߦCt| -,Q3' qf%*Ű[)rUe9qgxڰGR. GkQr+ |%>jK ^Ȍ ^`A ƜB\%bT֡FPpzb8JNA nn/Mn?Chd@aBD%4Ez* %^FHHLBDN*tf}BY1D )f  D9k-!\0G$-oN#c=! 
EqĆpTDw!2.؂ !JE58d2Kx{*FgGDKAFXŎ bQOBA:U(d&.b6TQy8zw(9hY ܔ@ 1.XI4D4NF`&y,( ::Ȁ`-bw7iUOby+ 6vt3Fe<8Ր7Ͱ1)9@(!,`*etGQz~pÂF;\ByB V.$AyK^h& R (X@PxYAv+!`Xm6V+ "&ꬍ .6L~I^pye% C,le1,ɅOBBUD CnzZWKbA[`?N,DhI2C>'21V\OvO@J:zDI<`hiqy Τ "NV +㕂 5Vb`A33ۍɕ7!M9RsUxHT vb.a;baP˂G-k|Aұ/DS>%(n04C cigm61Ruʳ.:8*E.5l`=Ѡf9xD!H<.xF&.$]S8&%;`]% ND4hoQ?LOKJ!Ou.jqp8Y@DVVr:*&_/Ɵ 8=zwhܟP9iR:LAS{S]^xX;z?€!`ţAr{,HBT TT@_#su=T @B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H/ 6FDŽ*>$+A T*C$׈\iH D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$׋%@%2 T]AI>Hg}H " H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@_1Ȁ 7 Th@@ TjH !4ӈB$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D}=H+XkK5{;z]R2ꦼ>-O ހq⸭e=Y#W@ Kb$].'A:dž.g/:5k,{W=p\ʣOb;c0A:ovjkB+#9ʭw b}`(:-ߜ"\('tmCx]5YMæ_(u+eHf!į 42+vNf}K98Wy1:-%mú5ưmz&.'/Erw8FtW0:"]8"7b4xGb$VVKQcTߊc!RcS cȦb}hΐ{[-⸜wuV,|eTsXYgUe]UfBBR`hzb|z. ZbS 1KN(Go'cڱcN / FN?̓@Ӭ41՚]Z=:-{8OcN:,|&m ~n3lǀS6K@ح E)SqCrP 68kW;ig흗V z+_2+pBRI k>` ~Vv|-4u"$sPM$[ wg:G2#b&|TFY6!nO >Œ\s)bG1ZxGOsń7+R~72ڕeo2|_7N\9 ~_zZY(J) q 6G!ȮJΖqI圢ThNw} ; '4;'~"ʠp8gŋ]t6S|;ohuO):_Fx+c=qCz ydҚ,iy1Ľ"&'ʼU[k4ȷ#o/ (.;&JxE+#Uhmp~ };-sSY}S'm^($RMJUXCּZ XA`RJ(NDҽƚN3]0C9gNS~f4Q[7 ֜-c1U2uLSS+ŎNI C[!NҔi핰"@<Wb1r SA`9 jz[.FQ!b<ŧ[lS\Yd%&B; K+AE_o w]SSYպ|ۣ"/믜޻M%ge|V~Sc`Ίxr --6A03.0ud B#UMIb}jȹ^3vt ;CuXjԅw M.ge٤~8L>Ps? 
hI48Dt`$)CpBix4eeA?HRҡ%f6 163a*UA5vg\K!CڝqǡZt jm`wrp^TLY^&riRV3U0Q4۴ɻ躭Ri|i2xT~|b"ALP8 ,A-ubn;#z}F?ՈchQ#F܍:9¬o 4@iȖtiG˹ .RuB$1ݖ|5bgQ|`=q5KԋtAzq'P)**&%JkrJh'Y`pԋŇ;C!XyKj76l8jfuQc|zq-+{ll("= CNΘ+n_3J +AY\2tiva{ݕÊ>[ImIz@%n׌S[a+~2Sjbtp`ӥG-d89Ӕ8wvFΞUsm`*z 8RBI?T`R`?[6mg킾5Meq$÷'[e%+ 53=,"sIY$$f33J9˃QXN deۤawJ 00TO/UܷW4r΀N!i/-L9#b+3Iͅ@/M}?iG`s"&Q+%Z9"1FIO% RE΢̶Ǔ u'H%s99Y*].gsr'ϏV{EY9X!ɻT`RML.Tɍ`BI[oO =7yb&MwS5d:K0OgDOL/N/U7~Cd0>{=$7LFKgdtpږMoeLӧ{%lTw~lnǴ7lWT xu~_^|Q~?ً{~g30IJ5 ݎ{ |)٧|Zu9]吷|H&퇹mm7_.?>q(U{f{ͮGlYXšS=W^1?o{a֨fNU81oCl+m'TՍ쌑P=N#'$F&ͩ1 !**5L =I.9qgO8T^Hϟ1Ae㆕ F Zb-I1qfid d_g&+8I}UCwwx \TvmYtVaXcV}a//*#v8.< ;ۡv gk`>[6iݏ ;lJ9_L +7yZ jyցt"cŌb_yYE2U ^Xtgsq>g;!d~NH`l̸(eTAduI0tm `s.E޵q$eOcqq|cw#I$CR U RCǁcUw=dZGEdQ0A[SHlntƆjQ SqVNip`94mywI>:pڡJ"M`+&U*|pcz!L{)@|x&j w8g9)*H,i~S~܉);VlOQq;yh)݈pAF0( 8#OrQHԃ1",D2""&ZH0RH7N+5.  "XtȰF"II=VVKs*x]})8%STH X*(b ̅vy0[&~pPWЖg_/?;+Q{Gut M([ f)lzvKkv}vUTMmP4`Ni{^ w6o>V\eWzj3ͪӬLz {:Y)ʨ,2h:́<7}7ET/0APx֞&omwDEhSnSМsm%`! h g-{0Kg[)[rS\sDkRб9$=\rl>0:K!@|쵲2aDD&p@jnDh:vF&x$9qv>rڿ-Z&䵓\B N︵AΕ-Bߴ97QJc1F1s-L)۫‡ኖ=, ,6ZYTS'4b5fCuHJ_x8Գqk\aqN{`>fyP <(n 9tmT-:z!tĹwڡS+|7 ϮHs2(؜9oQZx_B۳;G٥ó+hESL3"XGTq"oiV\c'9 ÞֳkR5vA;M^m,&ܦ(K+Z,K!Brʉim1A8I] 3JHQN"xXE rl-cRGa9ZdI #>&cZJ@)<,-vY 4@o, x$ y:zgcL"^ˈiĔAhnD4;NM˱?")%[3Jnnlk6Pc&]^]Lo-Ρu1:/[7͆c]~5e-'Uwny=~;rʂJ-`} 0T3C.(i%^$BоV0pch \éSbQ(4Ny,,H`Xg f^Z^Dz~N%|꣼D ywU$[[!;#JǕ N-ӡ'ZG!:x-H`$yuHm1:wA$<p|[ѳR竳bЍM'!.%(`߈&J@րCPQ$(V4&BV=_XrHc$-5Ǜ"˸J" Q"@*8u*+@V]?;Xo7ZسuG!M 5muN\|!ԄOj xP}b\21ACRk2"QyϜQohk9jf7"wجY[v\}<)BC4w7i*ۙvdwĝ3OJy2afYo7]0W٤3WxO7?# [ цeH*}ĉ'? 
GQ5H5+(dq<4xnn j\$bB%g"ӃU)Dj& #i/޼{ x:%VYz b<(d?>r,˫^V\vD^T 2}8W$fm;~dANGܘ1skK޽Q ϓ7u;?j7r44빟gm*,Ip/ F~HL" h>'T9Ϣ@yWwiC\?XU^a&g&~FEec1f M&lC<tԣb8*y/Ќ_l?2W}=Z9E1,{֦HyHU$4dՊũ5 =VlRE0%<Ԍt54{2gЧA4qўzB \:IkX6iBjGyVsaAkPKn{ȁ<<k"Ӿϭv0jsy|1xMӰ }&i1)k̃^%HJ8^Q RAo`KiIYWȸGC΢{0Ķ6˾*i3v[E;)ވB#'*:$^z "nJ!RM6ar 8M(rnS51Ew9 ZJKƦOcf :$<5ZR2#C_z3nqD[cB&!& 19GR1s*?Ӭ\n/~xGpϒ̠B:޹,aؿ2*&閺߿910 f1grH8+Z2ɾ*#5QJ~36f$糬M!灂Ye9!#=3`L,a?}»l9{n~dLgocRK쑋^ӝ]wa鞩Y#$~Eخa ,r!\0@̱|<kK : dj.+`LV7krTP+R3Ey21,1,[KE>ג)E 0ßEd -C:uJN-<kd.i$  -',ӱXM>-&%k<^}yAe"m7񚛮t,#QDTllz&ɼ"Ry2UbbG#'YSY"^z7 LG%Nn 'k~Yv!vTV")#kbV)eEO#nEdF[(uOTO|ZLLr)B~]}u!>ռ8Gŷ ?B ]S3?Ed%jOlGNڙagBuIjw#i\]$΋ҵ8QXV8H]$uqlΊ9wMހN w9m]7y%-d>os0WH[BIFφ3 e}{x)C#^U$!ljLB\ҨU5}3m)_E.˫INpx?lvOQU^H\Qy}dÈZ {ro"PV^ d1XuYVrLnhnѸ?j! MP;wf`\wƛ%*nX N^*hі^XlGBo1ŵedXqBe*D .D(lyDkþA2ywX8n\ +rm˟^SSn d;qQCYed:~=rRU X( ǞX,f";P eHjw{w:x! GF] (}^9NE!i/Nz$Y2d]⚦كB>sysIQU-wTM%s$+f[t!(sRBL86}d s]RwLMZ%GYGW( c"_'LH/"MY voip xDD_xH,&զlR\=@@IcOI=k],z5B)Ȃ14Jt)*k_cF>{/.Ic `֑BvP=5 }m.5ˎaJ$z-/VQOq SiE +1ٽ*؀d@> `-iЮltF*A5V-P|պ* %~, ɠ/C,8Flu+yXYg0Ь2vި^1Ls yYP`0XpS¤zy}֡st1TUzջVLW:zRLvcs1N[5mB7JFˌ  hBC'sd|2&d]g M+(ӉPu˒M%IcAϋuH+j+;.jPՀB+~Xd܀AS@d)PioX,50LbDb0ciƗI1ά :ŴM9 y+XGeұK \Vb8%a7i,X%q#kGRP6Z'jC Bꕶ#r{&S^~ϊJ "f_R](=iD^n!GA]`M6y-Qd ՜&TY\d`{2 {xWZE5p4 d36qrE뗣u[kwᗢH+fD' G1!EIJ!䄈6!2*ya]J ߎ]-zDw!t$MY ǀ0RP8t>o9@VZLAG= ]I4Z*#``!&SAyCs ~XA̬ƂBgs՜P$rt-H **2MʴJ5s4o2AcJA&Э.4(28`G!cHp+KP-WwV4<ܶMR xY+N";>m`pۿmؾJ d}E XVGK 0"(Qw@ҧtpC*Kt]H_>1&!Gu:SAʀv PJZ%l-ѧFk.IŽn:L>^B ;e!3ڄ`1ctJ߳< (Lg$B25;5vEQpYLMF"h4 <(@`ezAUU% ¬0,,H1dlD(! c#(jS,КϽ_vXI+,Iƒ5$RYIe6HVӥ*^`~sGͽZF7+mUh8X&ڦ{ů%U Sa![[T`M.O1ڹ<6/0Άvaܦ]-ms1df`c;V$FOB=V`&C˿4m'QEŬ׶aXSQךBr$JVO ]3yȯgFo9+̘. cA5(QJj *Pd!5@J̨IO!Ƞ rw‱oުGdR6v"+TOc7 ҍ("Ndviknw(._*Iv("ZT1"&7cbw ;]CTXd أt%h#*`hAgp `NmS; "7 3h҂cZ b $_TL(X]W-EB9y9=NKyנB]hDoZajlJ؝5F-6R FxX@Vp*#-]A-l1EO zȕIѾO30%7aFF,Sat9 L03iҭ ?H`"Nm곩\4rUf\. 
LDC1 4&IDYIFi:ub鹋hI5F,?u 358 N} Հ?,r7/[7| Eu0!fNCzGށ^x<0@pr@Q(C bkc%5(`@=+E%؃b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V}Jg>%P_(V}J J@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X )?Z{hW|󂎚R@Kku9wR׋@@+gdd5/ߎu$Yd\/ꋓl9YE;zƥrZ?n>6gB`1 kCu ~spZȝanQ(_NV./ߜ)zW*(qި)$~g;#(Ф)"qoϳ,r#9r1{>Qֳj w)VKKhyD^LO8.|:g%ڲmr9NrNQMn$\؂-~:[eV+ՃH 'guzAAzLv lmU쵖3d\ѥo7޺R@`Nz\ {ϗ_8 ~SϷauWvj#nst]=`>}OLiy?-964uGٰw EKp֬ bVDwn⸽*lTfS{eĨ5N1lV[I$3zo ;7nh]qET>W ۿo룞,ϯ/o Ǵ>ݵ3Sծ/w<~5Py\4&]/W/naQ:ZvVZ*f4ch~{c\Vgozm߆u+tT7@Wm(1(F閌* ?ع~fq9" zK'*˫7[#F7+OН.Jt̶}jtoͅnr4A lHUgie! ݋U t.ɉ^;nal<|͜|/ j\ qS`f%p3V?C!^i6d_ܵTE #3G޸B7.H<)I`kWѱ] d{  KKN[h:y7.xیs6s7kزX].Ϧ`Y l]; = q[ܝ}͉끌gWhiz[r=Qw6ieydpnZ}V/`-0vՀ<=6OnùXita&s~ և7{hxkyƷg'n=C|L mGrjӉC{Mktd~}xf̧ʝ?wdbn?7 1;7_Y܊qe[3S-1@&Vgn@d~N_ۅt{ڴl5焅뇖kB:ߤlpUs Qd].^WQ2֍91pb AJ]to6du #O%T鲦BA䒓4;]z#Ĭa wfL s<1km#9_ev~?,^eD)L\-/ݯz$%qHjI=6`KtuOU1`w+}?$.w/?pvz7E] aRot3܊#<$S iz똑ڐy).ڟI"{œGK߫Nk/!@76yB\ KHM+ %5:rpt'IpݸaC^S/9>{HnCplt*۬)p!*b+Jj"HHT"k0ftE/٤~XCXoC m#A(e7 lC|v'2^ʡē77\hξp5.Y*Ht7)dGR(fԼgNKʨ7kmƫ0Ly=ꃳ\}r:gKpܵܝNEYb<²x8p7+& fC_Eߧf*Ō& ~jŠ j}S\6}ĉ'? 
⃙44)8hЎo3׳P Ul>LV[7+HrUr׭Bs[ &2#lHՖ۷m5Zyy[bR4ÿG$,Gaī,*Ϋu!JzVZUDσv=9)=>e T>fBv.Y 'd1<Ř7Oc3rV6?O'ſ~yrDM䇱ߙ{,~82mG~v?Iy 0p4^ɛz-V/?&%3a=^yݼسiny'=i[ !L' v }D8lMG^lpqiSxPZj6neX j- |X|`@/w7kܻrchFJ*v>U?X]0O\ ;߽)ƗװoϞ4 38w$4&j(wgay♶ G"IjlԐ)gܮ{fO+j6i2 =47;x0uCYvvy٥$NLFkLp` !w9\ yl{Z8gЗx,Wx>g1N̜EVLJ;j=Eb !@.Ҡf |(ā.^گtK;ori]tť}/joq4۫aF\xs;RʅO3_&#Z 9˞])]w]qe˴JoJ)N\e2;e}2%C/[vxTFkg\i&bZV5 AIMރ1%kKȳ(39;XhӃ q%$Q2p%l)UsUCsyP2]zMg]Nl#i<Z\4$!ID a4u\za `l9؊a^&P.*\VT )E RrHd 1";KI% B' Ll?8*C%Fn6}Q 쥅u'(P^EJW\1Jm*I~1|HX/A|ϵ5"Hc@*t)=֘1 28\\kKrmW$j"]^bqa- IΑ_?U0{E;Ye7!%ƕYi'7BϖcMSsT6NkιJu<40xңS8|p1]<#xrK`?y}ͫW߿9D~wz+PQ' $8#7N}hLs CSvZu໌sNc+v{k%@?^|5\fi֪Stز5󟐎AJ\Y5@ZP|B`@Eq'ztMQJ 1#fMBM`_2Mv3uNH@}1P›q|ݸ(0n4޹H*u S$a\[34ba`2rƼրdbڴ:B js=XmzYb9@T vY4Ұ0)lOjRzX^s%@cQu[mzm ˲+*l(}29ܗB‡ o"Wz{wzha/bCmNQսk2-Hs^yT̈0\l lu5]R`'T^OO'3 :THCKK)FDOAsCLPj]jBM޹πO\ZL|2 "0*1p$VET!",+ rfCjeYJ΂NVK}3Vn˅.~~w W]kF}z^LHǬ+ Eg̀*˨ 1`Y>6Kݹ;nfStTY^Jls ޲t޾y!FRM7 0M\Tjc\`X2tie$XK:_ϸ>_q)}~dDʗUQRbāZ 1&+j$#?ҏq_Guy;O8sc}=JбtuP<ͲbvBt_i| -̰QY]w9Jtɬc=Q z6V63= m TVZEiKFR\QOGaD".)b!!!DCU 30{-#ML9@Ғa1l{>&H@ASӹP_GG6~+<RH) DJ+&F{k+̫B&>7\_j au(qCvYI(J1,,VJ\fAFvA xSa*brs(Qd6 :0Ḳa6rʨocΒb<?Ո:FԽF5欃AD[@@Q;](8' $X5o4N1D0XBW=?$8j: @^2:łK/K0I޵q,ٿ2 H~pa,Y'~ ~J)!)r3CEjH )dNէN3ܚ\ 0$1:"CEZō\Fbl+sHH| i!Nzx4LMIǮ7np*l|]4\{d[vhɑdHeEY#~|G{Kt]rAΕ-r.rD$-6J!۠LnLti O":,?ZB|^hgQaNbЈ,`z$s4S$6A/'nqU!SV0n~R[Bf&yuJ{H6ᲀbvsl#,5"O J2L^&nu(ڜ-#'#}Q˂46دLMFZn'-ٶH뉐ŌziIn0.a_59VD)  Kr_+L$O'r0!'uJH{Ap,)Q7 1M>͚x_\}Mo-y"ä" 򅫳fD do!L{)@||G&h;3Yb a$޶@h?\$7 eEӈ퉋d4 >Et dd,9'.8ODanafIhdʅF1"aR*Ljd<)VcL^jʈhA #(J4)Hn#6֜\$ck\>?D仟Oa>? 
LfuwNg>\j4N chj k1GB;  ƭKxù_9LЮPK w,!?ҥz#lsR^Ynim']X8W]wHaB& |MF#_eaUMMYǺ 9[(@5/ MZ[= :ez=!LoƃI(oҟt36 !f1+I0w g7 Û.OX/Ҹ]P/߾~YޭY5ŗYFH|',Q}~-%b<33u&{Ze҄23S uw3i~gv~ONZE>ג)E 07JI唞Mgt +1+>j)+)b-ň>f0f'Qׯ;Y;_v̨Wpׅ.|C#;4l:/tVD55HiENl%J㳩SԊT,V75U锚ۢ/[I, 50ïr:7o+'k~yư4Anx H`$;hLq%(O`pNA\nV.S:._Eu,^ZqPiIb&r8 M`_%Zi},>ź4#νT+㲗ºLZc.N6 B:rTHKC&Wt\S Æ7Lz PW"&Ozp[̾B'W˯TgqQ:{+m?B%gٛRu,za-ZZg3p)Z=";h"c`O{Ug o%ĝjwBc }^>aJ'[H}'U 1JEkJwJ# IgNKʨ7X[rDRj5%-~> ~Ռfmo5S{:6;[xYfERETxfIyCW/HW0}d6B#gT`"y-+Ǵ1`<3xc:˱^s{,O[҅uSVη;Nnw|; Pz1IJPZ%kZ,%kZ,%kZ,mW-[,%kZ,%kZ,%kM}0$G~seK.r]6J!۠LnL )nN# : ,)ZB|^hgQaNbЈmԘE iGR:A;3EbFͩX!Gƻ0- Ax`^Ek `=8E$=9Q:_\v #U-2j5٧$Hok e&||{hR?03lS|>@@}Y SY|7VYX;W'竰TCmôUdS(v#o Zzn˹en00X$xm_upݝkᄾF>l[DHEv^ȵV o#0͌33+]n#E9t/RS9:' 9sqDA(-aĘHpcF;'@%y-,PAjn'C0!B*Caj3jX$"﵌FMFS0OHKD>ƚS: :"6oG:~>W8٬ 0]z^9iMn'p޸F ?VUWpiQ'!nƦM Ah]>2 m:NMΠń{]tyYzs0l{vhBJ整֯v~V]+hjԼPr< ͞nԮ]C2?ć~iyV?Pqu~^GJcxYgE7W!O迷m |1#ԎYZlsN$Oj Lt S" Nap8W#Vx&D~bS:&E1[`,mNW-f6ڋsٯ?\.;erCAAU>z pV!D INea/!jKZp8ŀ1u4)X 1(H<rNrH3/-2&fZFH.Vܯ;]yiJOɻe %oQrWV/85C+O3Ct[>HLy똑cta`/’V4xoUZj/!@76yB\ KPNM+ fIMY KQ"vi r%UA)''5Xz/ FK̓ &E2h"HH: k0ftEvѯYb͚? =Ԩ@QӖ]>YOF6PiW>Tqg_@^4O}qޖ唱UL?EW[M:nwf7$E0Oʐ 0 w =wSIw!7x)n <(Y`~q 5z7 (.s:}ĉ7 WW5IAmpf޴L<7bKz!ՂqPKDf-/8!RS:[U¶bkRK[Z/J ܅grFL(X[V^\]8 v3f0V.䚷 [./- |bL+n|KnQLoٯ~?Vi?'nh?M `fkXEOkfT| ǗzvW }3 Tmy~c{$]ؾ_pQm q}e)jo< .&3\F],5,z]sֻ1dJS\&~ a/{!ŖH{Vlj#)7'.Ϊ̷0W8sOg0WK~*&U,5T+ϥFԼ2(gT;ޤ*;(}s[3fVֲӬbXNh;z?v~8CY5 \r, V4FA!5C#ףDl6 ^Ws~16lBكBM;quG`Z@7 L')°p²6xO)/X%Nmi>WIر>okC`$Yk҉:F+(mhXt,z̝ʽ3ø)a CƚS'GdW5^fPm?NǛmz"%&'7%i͠u4 ūOWf7WWNgwg<6c'F*s4{d7]?ǻ3I p ۱--nUV15KM8z uD2a,Ds)GX5;CJb  2S~g!(8v0)&8 'aXs[uF$˿p4?ϳh8f/3(_}-2?&FN³"J﫯=R׳gښ8_AE' NϵGUzp;NU9k.="4:,. 
!h)Be]v|}?///WJe[H-STeս‘GUGf Gk9vU|kxxexߑ>^ovtG{vz _{9itC,]+ m/ũl5osğ¤&W tl0-FհVCM^MZ525.Ŭf O ~r9?hL__ig7< Faз- XTyX]kn/G@vUz[B\ OYff3:!{ gCfyn[/|"K!Y K*R |l6J'6l'RF팀=^[۔݊5c6j!__˝dSGZ*f>L슲vR~ZOJUlizRYn'1KM_37Oդ l>+&'aij6MXL/r]m(\a^ ώ<|>g:OkeYH0/gQqei[gKzm7Jǖ Qo⁢ֻRQt=^K"dsʐ 1\$)6&eP2@R㷦cX;`gECٱ `\,;TJja DZfOƝ )p91+V ==tȋ~C@RN'+E#l"\BpYV 1XJt%%P[&yTʽefv]7\ƌY `!Eafe)JIBŢJS&-b y`7/kqoSۆjTByo},2{Ǘ%v]O-H6\sMr,@(Ͼ5cD"x,ŤCHFZed*yɣ۔^KEAC톖톾j|n@VZVvRn]Sni.UNGW\6cd6)!,N*A}vL|;S"ʰz ɛ0 b$P4ɪBo:+r8l3}[5lS,:uo+z,Kt~R(NxGrk<,XmI7qb}vIJ36 QC^`zr}w<yYn ^I>xYڧ>=z BPI OiɟRάrҩDpd`P}>۴fmuoznՃ4Oσ4>22Cy[=߽o{wɻj%Ҿ1.}Zd{0߁u-f@q&ո*Hkf[l񢶀 5RCgT 秃Tw^ƣFSPMT24t-C^>]Ǣ#7>>?:F#!RaFAJ#, (Q1)RGMkB_zғ/g.ɮy{-ߛϾl&}ZuIGyQ 2%K:eu6R E6 N} ;OG˧Y/rkepsޱt6޾y&Fy2%b6ʲi7/5ѤASG'm|s SH $|MMFƤ%K og^q5YGJʤͪE@*3L˜ FyN?ӏ ~S7wt &<:n!o?OrncV=ԫwe/[.~5!/E,0k&g].(akot`̲Qتv }͵:th޽/2DEb8iuTV I 㔨U7,nQ H(c):8XfKJ5cg<*8kotᚢlˊD'[7Ef׳Sy4mw;"d`sDPRנdޱ'4Yȶ@E+mT2Uc/d)TRlKV3fN v}ړ$*"9NW2(>pY1$WNy'ԡ۰KFbd_,\[U=sLɒeJlIKTpSMppW쐸UG֋f)\KԋZtbxr8mҠl9d]nh<"/\+2/^F/>;Շб>z'VZϮgs:kG_Ek.ԦQ,j"=g?>SiڶG&HE&4djmc;ms(!=SvuMPm&[Z0E!Wk˕`L#"'/>&b|AhlyA4b?{wy'wBbŌEIShPZRtGe=? 
^_8B BdlF3x !,0XSKg)Aԭ X,T)P\ae r>S+s{*E;S1Fb`eU"͎Ei̎E{ {Li\O~[{B g?0q^ǃI~UO)ӫX ,}?1w* /a6OVgZKa-yEU 5< ;r{%is8Xٔ |,֔ \'I*- ")NDDHޅH6B6b(3b2XZ|ҧ96Ƣ/)9TdQ׮.G%[[vB(6-4 Z̳g9`' C>2&DdqLZ #`F^-%Xp% 1a41ٻ6&WN6 aon IvC"}Z%He#}{RP43͚:ꮮL%e`+po0ٶOd9n3u3YH!IGvX콘z;{׽agIސ/B= ]4 YJj}{m`n`У̰`(ayۼHH=y^T&9|k ,JV/噃ZG;1~&F[S9;_LZ(?u:h # ]WSuI 8d \xH:SdL|wu)58`uRai2[u+Iνfð'&t$P&%bDꥦ`#&Z(J%2&"=`p~v֬#K68g'|1yt`&7%zΗnRg1~}y#YrOp&23*J$ :RK,)*Ncc&PfTetۨQ1p' z.7O@YFcn)~~na6m=K k#IkfYJQI"xXsy9nQ:"9Z"=s RShP( `rV: *PPܰM-?>f4@n-+9m/;k̽ 9֓NT_j݀.&w#KhǟZua:-D6u BZ96'nzQ^6x啖!9ڻucބ7xI<n|jV25Akonz3YhS^n;'owlsO3M$7$侞&;OWkn:cWRvlUDzOHøz|ayK-훈RT꣮` "9U*&q*$п1^Vx&nO!Q6Zgg5iP?vQ$%^L\ {Ʀ|ǯ?9erCA厩`} 0ɔ9C.(i6)bxŀ1u4)y Ny,,H@H3/-#BÞVǒtF3"5`pR"-s$#+sbZowf _KMHoVL-hqzzre5M,&W&[xuD݂Ab[njԆS{n҅%;3-ΎN|tcJKa )w8PQ$[Si}lRpФ} ΤBon~Oam˜uv|jw5M|5<_OFWVbF.:~,Hbp߹\iK)/GM+ZW75U'dzTe8X`F^;<j>\r7$h,Ӄ^4 .$vҼ&y`H%ErC,/ҮӁI^%IvIa[fnE2EєYǚ>u;D>u[i^t%f/WȪM&D/;Aje*G$fmnx3VkYzI}i$ǥ~~myԲ-cJ:BtU?, gS~}^lti]Ww(r:]Kj0gB>^ ˗xՕJ.cei\@/tY4a(yb4qqGK.O{!]jѥd57I&#ę/A&rݵ nҁ-H[X)ui]j]g{&hÙAQFV54GZӚꉹhKygԉ!wKw]ofu_ox6ɑHU%`% IBH; ЙA5(؀%v>j] }zK (.%,ҹXhV2ՠP)#OGRp5\HwJo8ŗV7TJ.TGNną[9)]㊣e]KKD)2GrYكB9yt|}|z麑^R_~tx|s@܁&#) g9HI\҃9H*{KUR>i:r2{[bፀ RTQzM잟͠)\3.QEL˗ cԊ:5&v7@Di0u}''˶:mM/_RjuS^FXrKaBBx4GF'|d*ړ`NGDͽ/iD0 7]TkYql6qG͛́zSG:ϙ$ *\t9`ˉ㍤?WI@g{D ׌qFIq>Oh^hsTMh\|J.&`8k,8@'kD1)|;__I>L/e]R;cJBM,{w>ja ?Rgxtl47p#T*H:қRy({l5CE^:칦

sIQaF%1FDB<@+(xFLs# B2$u dp\Ӓ@%?EVgߛ -_r^d+^eVK,W=K#<(0' A끹K=<)j My=_JԵGMI4Hb팂:X+<<^u+n",}p#_K?>'YpL =d[ŃI,·ʹ<@6mG%<0hs~{r)0S@f6 gBnxD}1E5Y/æ7f9+50Ua1|Ք+b)gtUu4ɍ\׽ٻmp:~+\o"7Gr3^ M)5󛽬wi'"~L"iKf/j Zv67YJ;HD 2vgK9\iilc3+dk˴!yY֜ C?%bZR[mnZܦg!rX$ N̠͒Rfʢ09Z  T U^%mIY&uui/B&# iQId r^jQG2gPX|)w޾ݝhf -6sk |7OߐOng idO Ϧ=}TǽAmU!\}Oq Nr : %0 pi;G=nuH!MA2Z[l#O]Hs&LeJ:g̱#?0V⾃~èk]eAkY׮K${>mW+|inꝝL&,`|>[Ӛj@kQRqMLWiՑdY1058sL[ǢC-hI vӵIi4 T~VHJbTZUDc -+jkݹ_Qޘ`i9\1ؗdjjj]ZsVL׷;,N29=or_IMBOܾwϭw 9)6d \ $(P̃&hkrTxkkc\!')Xebv(O ޤ9 ȓmZwalak+ wp3[Q 5;ܳMÄI''vC#[쀁3VN{tc^ KP 9~dUd{ISe(ȧ(DdEM@)1RfضxݚtlkmVvVCwJbY F N/C OZp%bmugD.NrS5).k.bgd0GGuBY;M,a>,43 Rxb;vq_akұ=-C=܁ +O"CqW?kr1f7,1C,I ~|rN|Yw-Q,$ϢX.Bյ2䧓oγUs*%f@GU.^j| 1oJ KcBᒧ -?9UҚvTC${'׼|:8oI Q ҈X"gaNe"( %/6kw&B/A]:at(+ ![*8kJē'EH ~&ܷ}Ӡ {ZY7d-6KAolUdԳYY#fG̊~ ЉW` ω}d5X"w\3{ HZr#QSvibNA=ed2vFq/ƅD}R DL(ƒ^XJmZ׼'>2Pw3а0^yg!a|P_%ps9=YyANME&#^ä}PJsɤOgWwO2X-q -pK@ 0LRrdΰq0<;H}Wϯ-%$Y98rcZ}9FL`Ʀ04ݍ4!pԄ-GE`ZϹ>Xj>7tQ|cHD3jy%zpz6:QeN]C?i8)Y%K(Hյ1fڄ~]y\x}>=*8X/AOfs+^ ~96Gߜrri [גVk[u͈X1Q4eJ>/=lw֛58ֶ*ݭ.յc5߲IϏq(>U^ g[v㡚dz*F'$޾G_^pą=z7G޽&i cmԃ:Ё[M_6o˶MK 4|7i׵95V.%qh{a6 $/dVhz͖U^]\b_n5rG4FH]*ƄXu{o.Fȝ>ҽ $[FO*I)%#HJx#Cs1qϝ@8':? W7u{~l܈B`x$N*ŗ(`, 6$nʪ@XBʾ$MG{ԖL[ٱ-2> ]U]Ӥ8i{f$WiZNǃˬ&+5xkc τ)`ƀޑ}7igON⸺JgC kĮ+|v#A;YᳫmiN!E(7 ԾOBuĊFy =eu`XϢƙxgjyApTH_r): ֥,JE- 1%,#'7(Ņ|,+&2p!k +Y _mmIj;nZ}egx=!M?qJ=>yvik+XzA@Ӡ6G}?H&qڨ$0&x C 9\-J+: |(_¾jvt*nLЍL(aD%)(E-\y! 
')>^0Ó`Cj# (#Y(i*Cʒ[1>h3N "y+ßY3s-cֺ}Gᗋ}ji$2IHRY/V+\'8SV`6'nD-.hVxr`+ gzgIL<筇D] 9jYû^}{OScB w'C\Y_u3]w-JnWfGΆOmtX8y]x5kFfi9d&{4WuTp2vEn󳜗.ٙwݪ{GF۷ei5N߮_ۮۆGg%hMN>1 eNP'tʣa }Ƞ#mlCW[g.{.Aw6qj,w΅y"Fj3n~P+-jmƫk筭1Y_VvHB⏝- )1G.!1nLԘ1(Ȗ+ cvJȌ !u+r^w`,SC((8"}G_l={Tl?'HwqӠ#ŬlΧ a5/hi6 كy9R4Q??l\Zx\4OfT|0oo F3XzqT 1bFհTh9?7os}gz>Lf&%5PyS53.#7>mg:95beL,q In)!.Q޿ۏT O:aX,XO\ *ǃQ޵#?kp=`X`pMd6HrRL`&EGfL U ,ua ފހϦE1`6E]{ۮmvlPmXث>NӬuqP Cv7)Y@U,r<ru'>"h:R Ѩ- -%RZ;J14-h"woRQ ɕmD5GcVD:Q+;+ҞqgorlMm'Z*YԾeny;kU7ApnWU1/T43@nH7׋&Z6ήͧSu.ݤ3Of~2%?.ֻpN]y͊ ,epdw 7W&8.r7?d㯛^{wSgX?xR_G 6><]A _FܘD.1Dr&u5Nqv+FZOQ {ڇӮ7!Ks_YvQ2Uwȗ#"z鸶**E$t4"sAcGAK&là^N1POQ$e!%ah P幔LBwT2DVчϘ9 ]wN]yyJɇ=<ӋLn;Jw`=[ϗVxѷ$Fwss jj8fm׷ssFLp}$_p]\ PFyQГ?a%4͊0NՐHRK΋ 丢LSGQUA.X7Tp f:{]%݋wR{\b??#ktra1G(iB~8ݏE?uPyE#1> __6hos&mA[UoW(&*y8ג1^/?i IAnGq"΋1*$ecco q^ʚC0 'g퇿ԫ[?۾9Tt+=+_^%?t5 /VEÜ9JIa5<93\4-ioNϯ^*6э!&w3^g=RK}9|-$_F<- 9s9HH&ɯk{ЎfǑF4VhةUN{'cK0%Qp޻`l.W$69ftH=P,hUq zNHSV2A3;WzL?W-T[{M'6=بYmh 0uHh5A&/^50;^]6O;`,}|)奇K;yLcy2Q<?ľt[9g\c-9 Ֆ;4AKPM(U*i"[%d J\3uvH[/,'[h 4pzb F`}ux`ysIx X,!7pnv pkJI#2$PTQ3SQK,*rnSEe`tPxw{K"Ve?4n2n/~_Og}A zyۯ\c.h<直_~)(=Z~n~{{]9NQy]GյofʎYGbui].>y]US"vOYF7^@#j2~5!(]XFϋ^V+. jFM@7 'bpF)IEH?Sm̀KI?BtR^͸dpͨVԹ1||wcj EZ o5 _m#<ͶQ4(rEߌA[sSY\h5q7 "cMo-3;{tV b8g(t>|CB٪.S[KX# 8GyN.PGYe:Ę@ 3:9,r( qrU1F9\Ƥe Yy*S)ƢN2Dpb4;EQddl5B޻cLe[;V۱35w`2Cвdq Y;x)!E=K@R*\X C ځPs `V3S"EGQ*QFh%ʼnV{_Dy U^.%eKxv$)?KPDRBHZcLH{?L+MxEoyY.74[͉.ey(}x1D@Q9dU& @LC4;SXJ5K("+ %SCP SG88Qkɴښ\]2ULH=Z}G9A RdQð{RN#4#V*X|,!ɻD5h? 
, x4x,9a`s& A`BhXg9{Jy(MhhȠ'z޺X9ݯ=#EZk܃"'7Zé0,B44RhBF0Ia96^-%Cȧ98(AlBģ UYUTrMJ˗܏%/Ko ,ˠ=PsCcUvR`.&8ģ0oӀUy٬rĄŵ\̮ǩ4%%8L,RpR1;&">kOw7?v,n&ՂӰY[Kqbnmt<|Oxi}HsM `,~;4)|ԂE p"eVz8o|'z1͊%\o[n-G ?݌R>iVP^Ivy]Eu6r'&I}zhd?#9~h[>vvs%L[2('DmYF}Rjz0H1)[_f_?>v<uvi23W*JOJTOPI{~;GyOD#!!!2ǘ$*MD3H  k̽܋:q6= ',3wuuWeS_tڜ}AxqQq^!9Na&¡::YyJ&zfs:S ]m ~evݰty%Fq!t[˦{R_*7IidN|R=:2Eae""[zmd> w2=?JhiM \pR+e^!F L(q0$h`#?ӏqoBGu~?б~qX<Ͱc=V3nxU:W>k+G8vɹLk-W4X%EZChw;,R:,'@6^{MUXC֜jYg@yN$'-R=ZbT21C7^P!8-"RՉ"Ф.yg9v+Ya۽ugϰgj{8_!eP~3`NdFZKQ!7s$IT%pꩧjznmUa53hn:7AS ZE[s‰` lѰסV#C<}4*(.z+_v6z)d`էSYS) Fb9&u 6ĹPiS)hi{l[vm'vwԛ>eOϺ}X=jZݾ谙;tro\J]J{O|~a"-G;]tB$T+U>ԪIhe2 agkLes*Ii2]lɆ`DI442&8W ӌYxws槍<^ҀL8;'߄;D"dg7A ʏZ W(q'O_"F^SʜjK BB(U͖08Sg[ d_؛Jɺ+ %'R-=vln`v;15Cdd+Dg^VPPևRMe\Tޑ%c[!E!#N \D2H ci}Db so<L W{0n ""wx[Շ `@Q^lVP=܅dۮ :oE},Kևj:j69&CeJ%[,qj܆n>8wظVGl4mu%"aqwxx(FhiFp|!*xWK†e`ڱ)xH@Xb/l6/G5,? m&;l~|ݏ,T_*UTTe뱜Iq(ΌE؜jHRϡT9\]M-.E j:>:qվPHCJ-zGRh+U8do${np:_VYU|pP,Q!d &ee"WYtu\ 苫FnzP_IZSZIVWY'lkZBA,B/Eeiy_&M"zc[!e\vUIu)eVRUiUGThaNDw1x- &ccyɓ*pVct$M&CV2{-=! kH%kI=S ZY }Jc}0f}bQj}Ԣ\9癆=!e-Q<=8̋tҲ_-Gn{O1i.s?1f l^|jXr.;L<^<%ZMіx)!N{k"lVl<+~<QCj[}{B$+%&s.n>Âw]r]Ӣ27q%hRe?IdJXWߖƤl#??;!W_75dq>i)|ŇdJuʜg~T,|tasaZ_&*$Pɏ|q 7ѽ#^?Q0z,o`L; UlxrxZŘG&|R^Kz'>OD}49?|uOgy.`=0)c>[dK|r2D$sщl\kmDȏUlo%KzBJow|VF+@om$1[-=Lm *(/ x &dVX*KeP(YFP%uR9`L*[XJXT!j B~`88w|?!OǿmEԛo`9x6ÇLl$jw )5$grr@,C5#d0&aHlʱ0c6PGpYrϒrE^}[jMi (y3՘Ub']-N춺7+ǟC9:!ZjNӸƬ }tvlU P"GTa)G U7=qՔFׂEa\]`q ;*Ķh1R֓A=ٻ@P1*W/ԉ) Vƒ5dr)<Gg[AL%wx݅0OHeyӾrXb[XɈc~)x(K߯^,/S^ |foN+{>=tZ%OZVUT(좓(X\<{}rA_|$ p0“4,ıu<*o?wp"/W+c~:lXż^~Z|}U1G\`uVGG2t:-%"oy36νyqtv|:=A~뿽vAdĭp]KtW/R.MK9+Cp1. 
[J_.2+SGRQ#^;9`8x&7 xzZ<"&?6m$ Fpp>J$$FvJvEA8 ܗŹb}3+,na]˗Z۱9pOP՝Zu/cW U-Z#@Q]H.yaz SqiϋXKӐk牗)od#^1}D%HvեԮ*MYF2ѳQ"P2$ qȹe$z[|)1VԾk.Lp}-_|C`˯+q9Y%!]U#{[e"VLJ^P&Z r.&B6ƌX4mǬ:Q1ZscIk!c5{pF=/_ER)SPhb06t asBT&BI1V* eʷ`s1hXV!lM`F'/+sZ U A(L?aHꔅ.$I+۪\k(c>o+Hm(fԬ*%V62JZpN(n\UkI ɓXMDН@(x qu8~ RU I>i=BK-C ;DžȐ/&r, ֐a*}" A1@ WXteI8  !6xo{ La"V GhlOXxR),"@&%X/`tT-;uT|T*i^(;80QܞԁAW&` (,yT*8ՆmUIOXbGYE!:Vh 4xÜkKe${P2[2fPLOX\R*[ {JUg%TJh*1+M,8A2٠&!L( F/p&R!utjXF`B]^iHAwb 0.`S^ ! )/6 1 #/ d I\ئ䛃C$;8% ~a5.e `IJE7ZR2AA%XΨ\zETU<ʲo-{@:b}!S.()GV*EW&[+\/ti՘ $ql`yE X_U@"j(B(%6*iwqF_i8A?!\. nc,ҌV#+~zzf$[3%ˋ,VGU@7۝@VAr|Z}w]@j\UL>L8ɮCNpLj,(t_t=%ѫ$RDN+We2O.8 eX0FZ />"$/eLěq{dU|X:7)@JTU U;+Qqn[&Kf'Om 7k5G#O*u>ei%0h6Fx4= RV/tvpC$*r%\H{ۃz=) e@{{(%'6b5I(a:AL>^|ٕXC;)f ܓ:i등<+  B2%upqAk6Fu& 3%)Do$dbzӁ{c,0E %#z26Љ6ԗl=~ L=JZ!gѝ0Mh%r$-{ jPjJ1="*^ ^{42In$,B l`_}tGK0ACb)hXq]wprַMK/g3*nt::vLGs nc a0aؾO4mGEŨmEƬh5 U#yHY}thƤg={4l@ŒG&жĐ `)GЍJevzI,c:#S :( ,CBz DzeC1ւM [] [WH'M2PziN"g0Eʟt׻ //V0 .MLCѢ `6 Ed$dӻzCTXD t أt!h#`^'p `NlUƚ^1s0@ZpIPAR>g/Itu20r&jκuG;Dv W>(Z_i"mb-)Eѫ`mûIL:JFP ̀|G=ʨh_H ЫtBD;9>X2'Iu{M곱Ĕ5bUͰ\ _9ed$,`*匛HbC(:)J-0 Yey[GnghwEC@LCDHzTQn kҾ-MĬMw.gM" ż4167 mt\Tŗs8,/ן壚_Y=M2qEDxurq< ֖XO?毾 ½ xY۷Y?zNmerQ%AE'8+sJmj Κ6{D`Anʄ܌,LЋ1&[:u51F9_4It}))1*ېk鼱1ECBC .@Coi?I +ϟ`ϗ T mf=yF_R`qlT h̃u |ikmsl=:VP hy܂Y ժ]-T@ǐ >v%.4E<i<]Zh}-7I=/3?[у/?Q/o2G[TϰpӖqh+? m]Ѻm]^8 0TT ] )mVD5,A>PB}9Mɱ5޳94.:1._Dw,rV 0\kߡ{u9`t~{eIZx~iNLr .@}eh5`V>D1`z @v@@W<=3}z"VW/nVYbGd; s3YV -w҇BąOJ&ǫ;N:*0VUS]|ߐAԖ/Rɸ0@}fCy|A<<f9f gNuƇ *.yEΰ"UYƊtBʮ띰6J:xބs2s>Øֳ<D{$m$)|2tGiކ'zI-'֙Jx #>|> =fwY?4cmIsfGM[^ N+%7GsΛ<-M8_?yH|߯yMY4ZOIקte?4Z|}ӟ%O6O_?oknևOdzR2vX9dz+pՊpuI<_w5<68{.Æ4q޶v+e}~#hS[HHil#̮Rie,-xTKZ|٘a#qyi؛:b,z(H|U \U`߬m6s% |1o_fM̿^VnV'FonT`nX}` aËfem׵1t1;.+dSlHFSHTDVك\֓tƓz@v5&Ba νE5 Flh/d)#{ [=ʸYHU},9;|tg%kyW 7cז MP53֨M2tKImgl]5Y?`X3t&`? 
5P0^mi<`_6>XW_eYZtu]Ů}xl 啙pv'-,;\XO^X~f}-Tao.m~16R,;+z*hXD e+lJSL }(2!̜\w5[,|YUBň;c+U+@@WD-Fw&X)TX`{'ò-xXsLs﫮`jjUN*Ijya8I՘jD{@s ϵ/k6fMy6.Ey{vxwv, |w̗#h43K;x}_}Ŷ!dvgFZ[T DS zhtp}-D!8`ttYt>J;0} m.L*2EA WE&3fd^H詟 p-w[O?~  >b:$#q[{C?|ԝ5?~Up6ڷFI&QSkrIRhf A8Ylu$,Jv:s6)QF#<EH&s˅ͳ3z!Olz4m[wK/ΏV9k7/u*t{ AW:-ZvVE,:I+3w;U=I;鳮aZ%/DTkl>COkw\,ףv}WߢMMBxfKxghfohz·ozqz>^٥[״%.UH {x;Zvuѝ ndgio w޾}Qۛon|t/ݹno}uo`;Ѷ-ٵvˉmz)k}mO],i5_'_@/17;aD`l>?7Q޿50urLm1oFlk 12S6~x :lΏy=_쒯n{u]|tWGʖ~{_P6*[zSj]{oG*gw7Rۀqp -],;D"$e9w %QmOXg]SzЄ[ \B⯕ q[!=s"fKQ g8n6@4z sNjEwZ٨9-gw-39`TTHd}]HP҈<(P ,cࢰR9CD "d^nx(垧(u{LJ3Czr\Jڣ8,L>!EH\D @Z INΔ-21UPKg_#g2.6#~ߕ_9jt O$y;ewe;=5ң_r1δUVzHiHj]W4 Y֎FmsЦVjңe7UrNN5^|5=3dfus({{[h+q74%htkc;}&UͺnU6%*TVBW9Ԑ.x #^闛5^-%ER@-U-hvW%w| /p/?VK/ ʍ*wu-.*ʤ9.i-D+ayN1>ϑRȼ\!ǗDJE-oiƒgp5IG0&wCJw@UQ}BBM*(C!䵄cƠSbPER)Вb42#ad,bmX}*Zxq5㞀-K?}cq/ox<拯ؠ4J Bb9̆%DD-q.i24&eQ'ć=12lr^ ԙ0 mGIL(J>&AFb0"#aD:˴<$_g1.۴bkҀ .Fp <9J I-Z>h)g#Sd.pqx(xX;vCZnx+p8/7>S3)؉F[$5W 'ibZPk%9M=/$PTBwo!BUS=᰻?ier5\RZo#HDBRLC9)2ሕeBcSj&lgpwxᬶq. ee) blBӃH\NI7"3%#_㖂+?^|_v7OqO|i;E2y-F鬆L<)oF L8LV RbZ)k LĈN$@&.s.XKf"rKעe'ʏc5u+&\y# $h*Wʞ xI+V"Hu5tWKB k;fBhE:NBs3~y^GT12QB9aG#;s7t4_ űkOH9+y'$:sO@RAՊ d?4nIS ;]v^8^_kjrw#]8k&fmu{i*=gWZޞkguymsM.1lqnMxtr\[nYm墩pNV;iq&%{grL}Ӱi4&CA q2m0j:k00?u =st>Zpv7ku}ﱓH=YS*r >G.wp 3BrdySd?w޼W~x߽+? ,%DzIO&ק`@OMyqnj6SO.6󚒏3]vW{k#@~$|?;clN=#]QV2\OB (˂y!ֺ?,B3zE? h!vS 5Pkq#HGIb7n'$F$8b"pD8ZEٽI>w.|pOXz vgw8tW7!:<=ܰ\Zh@DGFl0nhTQ*"wxkZ涨\ :Eu q1)I*[ &jx Y{˳ABw't4 ݙQ xPP lQoyx Lώ|A<39I<^G.T<2 3C0({xVh:@#wh:)lB%H{+DXlQf{ED:7`BNL!&ބ(t4qiu:h.]9!0%E;!&ys&|d4Fs*4C Q T%)` ²ať_}<{9l_P *=ġ8zA7 5ʉ1($Gp^;ܠQrMiD{1]kLb rhwm34q grim[0'\⑟:v4.PB֋ŸD@"'6 Ef<& |ւ܆[KhA=. 
ܿ-.`wEt E0Q@:6[Olp4ϗ\l]9 WkructPG´a{Sn㍟g˕8LRpZk-gV0ߢaP R;n;';wp ,DQeL 88Dv$Or1.J}Znz!<}ʽη_`OY;7PsvG0 ^8pSN!YNaԤnL+a <%ZEAxo} ޵q%ٿBN0i> xx?n6O[,i(ɱ36hؔa$6_]TթշMWư5Y`rhmsުtVC}D .7B; 1&wg>[B`DhHe޸`Mcdcf;g/5'ڕZo|n5oܻm A&IH P.ې/2'I…“mˬtowݨz]hkЕqNÙv/g_+C89$)g&(9LPXV:a$} 9)Qࢷi${G[z(=1@yFݢ=olF>fb۫!]*yk"o&EVՃġً{4GGS$.)3nLҡX$YRBTT<Ό1!:eYBybqY@ڥ^>DjB6.m#.~ۧ>ߝnu&;z"vD$y`Bn@H)CCipTz͗90x%߾uh> mv%~z:޳d{`ZOĴsnP'Wa?^3nA& Vߠ# k/6co>G|tTJ_rF9EXJZTzcsՂI)ynmR\H|ĊRׅ&RQwXD*dp0qo>-G]|牋Ks%yq~Y/V'y"cf߂;x,2k-\rhD،ߦ!eyCSCJ\#C! 7p wKGV#l%1ʜBq|=, +/$x'Ge!qk=zO,OFO^?Ӥ-6v>󱔬V~g_Gz/=LV|:E++ 0ٜuMqLYW̋pMzv[7w=f;o٣Z]LJUj!kWѳvNJ>z}ur:ҟpin}vu(tY.OUm%T\=9k kަ {0˝/ R = 8V+oGO..i!=uf6?_@lԲ};?kMMT\].'N}$_O& 8gRg'U\zI9M2>O#U2Yy9"sMP5E(@vʲ'|e Yauql),3W8:!IߞYZ۵l 3#9A+}d^ghesR|B-ge|o8+)UNUB}K.gU=I9`?f##k'Fif98lZM{='fo8 vaY[Y:_nnSkX:4mI}lf+0!mz]C۬i#sJִQY$;Ub:}mOL =k'7U^R~Xba<%@omL=y4 z)񛳶fęZOkhe- oW$AMNѳlY0oN~_hw/OZ|<^rw5 .!U_}3miEk|n4%8rqe#M^V_>oF2SnsnWuF_۪Fg}جLa }$h(A5b[ڎݵF~+ի/lM1Yy]=߮Fӷ{H|t~jԽg*?ɷ']=-Y߄K(s@ _@Zcr>nɗz@UuJ2NZn[ BPћ}7Bce5!MŸ2<ѫ Xp՛Y.8쒿So=ڄhL|fCȕĶ!}@hP+\A-jw/;[龪X!>s>Jÿׇ+|>2Z]j'Sx,P|?3[f>k2=01d!#ñȈ%R$"3\F\c滂p:@/%0 mUTOvrzgRnJs;o*of;vc{ۮmmw̿z{?uZBc\EuHwN8?j?wg҅!,sYj yw A=pIx=gr`љ߭f;D (nb#_M'3Nlڴs/'[_gyzO QUe;Y^1pp.PRug>_XCt鹓*aX$Y9INrvSS0xRQKCF 7swwVXE6cٷ}Vw{3s~ߛ.nx^"XݗFػꋅdvjeϾRK_&HZ%,]$^('QI3نݔdcm4Y|@F)/ўW~|rjѷӣB;p/ct!Q!S$LvPe<7(*Jg"vrzn|5Sb97 (]SN_adXOYD96[>Oﰸ?m.NjGl]~¿ 2nƹ b4 /|16}I:%'ԅ%}K\ E*V+\LXCJkr92,w9'XIa lj2XFrpY%x'SA tPIȵ K(SRPEe̕ `R*g#2'+'- {`iI+83Xh,*jE))HeJ 3m`&GbsOԱd -]x=i&IHI$ /xBa>K^#ֆahx>bMOFcZE fsqWD 6aˆg3h|{#QID @y@t+mC uĘHg])^a9"s1Jq9J fE:ˊrEDXu!at\h;>K6;qu$~k! %2%CiqjHaY,RԂ%ITLJuB9Åɀ$u Ab1JK,oyItf;!is"}@j3^R@vd@%x1)# C2]0A*Ҏ)L@na xLq$2h?RT݌*AI3( ,(y 9)<`USU%.fd c!mVa#uhPT[Ί$wEɐ( V:#.l#FU`$A)곛C S*!zW>EVT B/^%,~AY3zB܂VŢd2H儈(B2":LHp3hmInHa娠($T_ǵ"b&@EYx$Na Vev LI;:D͠j`ol!!(B0`(SP|,B C>C _),|qD`P|YAA.&R& vRƣJAd*6d& /NJN 7%QBA1Fp7GRFU `hP3(Ez@_PS (Ht@*6ХAYHIDPyIIU2lcFd~)Fr;)pPBI1YV 퀁h! 
'l69^GT-6gc&236ʭAVs-%Y!$j"lX9 !s#%'\xa>.t-\iY-8gE|dm݅HP6:D-xg25BX8Rw`̰LuW[(Pu*KjU@cp1 %Hv9t /TXКP<4D(2!/C`$ (gf:-,h8P{2  ƒZ`2V#FP܂̠m@MKw dA2X?`y";`~X&aیj2Hr$; HV8u{ML,d+cCX,[4 1;Kn|p{ \ЋA NB-e\ !JE5u1 EBA Qv3Vc,, 9H*vJڬSrF,JZH V%@ "uS<2M,tQa6c;EP nVΨ0"mܶA@#k)NoQ2)ʒc]Ξ9̜ofP'0,:QLt< @-[Ho 2"eHray;( 6!tϟWk@[GX5πAVZ6Mx٦ҎRG1/ k369Z:OgG\t h>mK-+_r6-dz.٨oy;/NGuJے>ϛ ?w̛q[ d/zU̶BĻ# ǝ~VL <&'Z?y&1@&P,@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL/ 0;@u^<&PW>3Z>u&Phb}L  @"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL/ <\=33b).aAZ<RJCL/ dB 1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@_.ω dR0`aZ@KdYm!&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL ba}PZo0n-ī櫟'Դu_^7^{ǜlQwd~]\' h!]{>|f')Lu::3iB%OA_y1f}eI~~Uo}եٮT* dgqo,V̡:-}e?rZ"s$d<, RuI~I堽 Ǿ+TlNZ<"5(oTQ\늒DpϞGRGuJR S"_ ?^˥0oT/WX X}&gmnGsƒ tWJkjzIW'_ i~Y3 ^L'?7~۹J_F֯v奫" Zm-o|]ӃY]#<ĎۛgQCjP\7jM댮 dMYV&['s zt%"b" #Y`ퟟm*9uȝs, +}Ф\^E`ig]Y7)*. [|R:)wiK3yeᆜ:k]F\.ZoxC&0 \|.t{GaS9ZGf<W٘5w]Kާ BG6܌d;?qbgOS0SGf#22x-QCJ'R$;V-E̒y gk0bi]C lK=0&-mKeOlc92'Z\avǢ~4Z.]׾·߮#vX4hI,bZ?o/6Y ʊ{xVг]l~=/TTiUn\7=Wa}jF(⸱b2=W_̾|z<_G Bt]BjYw)kh,< WT%I++ʠn݌vs;LZ :Vo•wFX.l^^^B,dz<pg5mnUP qM_vUA1W#V+fYz{Rೝ HБs;?2ڣ`@#أv0kގXKu8j~ݪHoڝ`Pf/=4@vE]&6Q>? 
N 8>>?~q4sW#[D9mtrS~16?OLo';&ހx9lOܸݾÅ՝wZFgkYkژZ-hk#^\[BZ:i'Z?FV'x:UJ@,hYD#,&ħ|l S 3 Ѻ\IdDQuD*y(pY:>848O>]BOi%";_og͓=u٬Oe 0(曶A>}q=}vcn۞$r3l={NOBZujU)!PȨYZ{;~wȧ= ;2k5!AЈv#^KTs LYFwMTZN.7Rkӧg.ftdBogz5gs])wuVi;x>{l (S= &3dꝫ'Rs-x*O6?~yˮso`OwZz:lV-tfCqr6]t ʼn tℎ)K=rZhRp4o$\=|2ORoNj`J6@K)0s)̎,wp9̦"`RdR mJ.q< 7z?o̟ڦOgi 4mlKVsBk'|= í ֕;ƹ1cZv!Np}0qQq+k OAj&>>#"`l<RpV;ctE{e,o/# 戫y_emIS8 ~vܴ94נдykNgLOWu3;͛ gRwϿf?7i96_+h/}͋__TI^h^Z}ج I/G_ {ԽaNhRFUυX\jVך@ʳFGN??I H6k#2oٛo>SH'1WA3@ 3c977>|pfqo?afl8_w4,]oڽjz@Pp6m-sRe|4N d'}uzksq!cZ?0~0#99c.~za&+͖X4ܝm)ւPQ}IJ 7oSGZ%U1XߋdaU__~ۤe%uY ޿ ;PdhJ+x$:f\*7jRQ:VX=5C3걾QHj#pӳTM57O~/gn_>Wb=FW,\?1}$eȮ;me]Q+Wܱh#7VW']"O 'WֵG _xEFd5&e>g.KRu5t ֳ|!Is%5&eC)m+pKR ŌNex'7Eĺt夤Aڡ,q\-<|CChׄӊ>MZ33\|*Ze >b!'fʼn|щw>8}=-)lWޕqdҘff-}HAku8HȋKlmwGXW{{wu0\Sh4&-SxIk Sh )4JQΙN&([#=Z]\نyom=uA&[hz!2I ^JTR(Kk)gJY Sq;w\DuBs4l qfJPRoc,D#JyIqm*&<me7rrCR# >f#v ק g}Ld톟xdwG1b~ 1|1W0]nyHu P?5U7e[[^O~\ FyQ4OqLFixo&9kn?Ŷc rY|qVbדAlkIB%yLyE XaJ!Q]z!)%\Xٞz.64Hhz"V`4hɭ`u'8p_\OI]v77Z6 t7yB>^޿(M+Gy9U"T Q$Mf`x9sj l)2y>J:28l@}2j͘ ;$Hץ:g}t<gLxW/|.ST̟#X)?Jk"m"\_"PIɲYE c/9vx$EёʘmY()ɔـP QO ";n>(!j%WAy¬Dy%YLu;;ΞcrX|FxІhºS<ڕ86o%e=:\芷7ׁ̓3޿공.>l ;ɝS;ܫks9B=9.uܺ=nhzO6%ͪClYb˲>wk{l|絑fWZ>Yky[\>g%W鸖]ڢ#X'l4%d] s)ew|/?²*3暐#e=ޑWH(Sk/:xU阖Rh)ZWw7]gHjL0319 J+H4ׄ2N2)^NyɹlOz1lե@aiwyn2YW|'Gl\k@)"1EBJO8 Z)T-N##䳼c>L1]AꢚtxƓeYO+9' %gMHn &FJI$+2IQ =Ra>Fmc/s|To:F$uV[ w a8/FfbI+imT=wF'鄈"uhB.CCMAkn`$;K]0Lu] gK#dCl"$HA]~8kCYW`:KWzE24 ])l>EySxCxчX)ڞx2 >44h#> ;*&Ipsd@ O^n䘍Ԙ`3`eUzetGᄌ>pJF *:Mgw/h_uVֳX}[7j26KS}ԧݠu,K4 ǫτW-VvWB59;<1N%ҁQyMnGAEt塚NP~[U~n=L3Lba?!1XtdIڌbE9LxXSI\&o9Lиq{c \x-Mђ(:FMg4 n~:Z^Z/[lO*XgHE<fXJ(,RGa@b,a@嵽Y&P |)mA'm( S %R쁂CBe-C:P%N1 U<7u5!")L43* QCLg3jPJ(krv甇37•ޞMg`hY(NJxXD>]*ZL7 y]T1*0"\ك.G(b+ Ջwp[\dK4U˥VPB6g; {!L6&a,67xW\&bmMubͫ篾sN97?Bg܁Z =7 ?_Дw54 ͍gh g\[ƽK>.Oz%n7ӐYoQ.n͹čb഑̯*i9[:BF|VH},fbB bÀ@GjÍz&$'a!_SD(('4JDTk(Jj'hj'W84,)*NA>nXiD$ଡYaȩA=6MȜޮӉ];;9#.=b{uĽvFVEg-wyMR8;k`AGN< 2WKh,E4J.ڇx>'uBEUwE8'DLB\[hK& i68xc+e)D{ :`;'7xNG`{e w6HVpĒE5BJel A="Kp* C DJRhB8I1eszy Dsں>M!ƉGr `"e+ ƨ:ybK{ )&K*!xT3*:; 
BqKC=+I1nީby)q.FŃ0m0C$&Hlr=L]$ZĀXZ-EaAN "(" :.g\}9jٖXȱwz84ᣱP߂6*b"A,j0 꺼^+mWBm6elf+3ÏK3lŏhZd6”ŏÎw-`@jQ[y:}R=b1s4HR^l0waLRp٥!ʔ/=gT0RATJ>yȀGN(=eBBƒc( ڛĉ HNѸH O1+)ə`gvv`FXYbZݶIHb)v*~EqιbX:޾y%TfߠDTQl-:0ΆRsaʳp!:0tD\ eJ)pLTP9Ep{CqfN%#(s PRj"#DL!L(! 3F+Bs;8Qn!g8w}hd踆|_NӶv{JPO0iPjJhaGuņr|.pP6Y.qO0%wJ-U\힩^z[{\Qra"HSN!)@qc}JXxFbGNd4BQD5&PX:g4J@lAHѮݶ;#gG*{D>!4=Ku.$hsY2 MRrEraSDjǂqYƃS<"ݺk+Csf:(ߚO!ZGPHeiA:RD3r jj4Wouyr^:w.`{k,V-kX{_r MX=ƢtTxNK ̩@^בnHR T>ZĀA#kB2N[Aђk95cwX3]g|5]{].\R6x w>LXT~>O~Ӡh^Xc{ؠ0 "ЁXNHIhDD-qm/f &DiP?l@ sT^rLAQTBێh(7|L+ݱMawh^'D"rV(qM>4e=G%(/GKrŬ[e"gKP/VB鬳3.y^[I{1rI4x9'B CR|gd9#)VPO/^z؍^ܛT]qc!tqTX  cwQ)KTkU("BiVbjYC~9W2'21\YkdilHe`SsueZ(3MꢛYԢK9Q5'rmʣva[I{V<)_HI@"Zǣ#26.s˱BeSA_oW=WF];~Pb8K @D!*\SRJF)k` O3hD eq\qOU_ySZZ02Lݴ[VWN}ǻFltavn'?c~kUTV8]C{Tw䏧ZP"UAw fTGޖ/]vuWT–`.օJV&&)wf{0G}ThÕ ɁqV9+ Jh(Yzm2^aMj\L("8xr+ $K]tE{^ޣX:HW崽*XϽ*+7g> պ+|/'o/MK bkM$3I>*jmMrњ;Jh9V],qj= {WUQWx}wJBfNsm P 0e*}dELT$`9OSIol Rq@ 1jMZTbRRЀ0 1٦u Z' ¼H& "a+)0R;f H|sW|Gy |}&[";9[eppP8|σWTUQenǓi῾)\8)0m0)QxX{}~hW9̔gC]U9~: jqŝPH{K:^th;8a67}Ivt#%Cjj}< 3uCr%RT@1ʡY|փ:k^hgͫۄPԁ :a*& WMu/%-L7 Rk>ft5(o2C8˳1]Z ] EKΖ777O azg[O0E 2GfQ~mj8 FnThTnv)тxF7^ȔgTb|:A B@u3n8%lfmss",O GXܺ)/ bk̯'o;IIk~킁zHlKY=)^NJVhqrRe'@!KFRʜq8U,,h?Ѿ5{I@3gKPUdnEB5-~i`ْVN/keSZR"OYZ_~V;z:Xg?fU|^d}m4^ u.!@LF <\Z[Z{i|8~R'#s1*>S=J !ܣ8q>̈́ 34x΂F刐€WhSNYrX39U\$B!eHj9V:WB"DL3rv<؜9lf~<ĝh:^/t`E&Н.}(iRn%w/XmT! R'cd\)tRD+5$&(PL)dDr-U 62nu4[mMr\)ZPR%(Thv3rK8w dzd3,}[=lS4eJ%eZ>w6sS-]-co9XK`%8yq5w4oVÜn-S_Is4Uwo?5)J_R\/+ָ;;H>P;B )4sW5NLCePUgfG̷~6;#SGWӳ 2Xn]8jQG<`l%4'1ZIT<eryKl]?.֛W[:]|v2ͳYw-KlքٽMIϷwHطC+-7C>nZ_t1wE[:^ܞO?c{׻dvm]۔wۼi!kןJ? 
[unrecoverable binary data: gzip-compressed log archive content, not human-readable text]
Qm:BV`t--=C0@5wFxlewqlUYЈ~WH=\FP}eZjًٶsbSďM pXiƶ֐_Bɍrrǽ۽1qrFrv b8'ijԈ{&1;1&sj9K O2f#I*ړ OXj~G(U a}yPtJ{^ܲ77D>oC/^q}}ovT><}?Lc'm0aRqqO[j 쫠zAޠ{WfbۯiԖ Nb:0qgξ*f^@{"XGl`san f2>C=C-;2.ͩ)@pg?~ (;YafǙm tMfR<fuGm65AQnOuAnSB;/ϙ]/Gqwts98, XH:8h;zJu`mjgs½HR^4.' كIgI#&јJSRht4SުBLB fޤVuHTa.Pw>YGSުa3v?[Wfqy@P>‰\~q÷14\*ocpFxH^>cgc~x~5\`s69Y:/z-b>㿮oɯ얻۩/~#K;7EiӅaK1/.?}ג̭cIw|KM&uQ80H@YKioi7ґ`̓eA,HaTEz1ŇSdD&h#[>ؘE Q%05nB}d!L%z51L9b>t&מs?(!F+fy?߯_7Oj|Ҿ)7[^rUܪ᷷m,2_)܉5woj?eo91Ꭸ!'P4~2M`-t떎QY@\#X;ڼ28m>Ff*{-e2TLp~θ]9G_(9Qܗ>D)cxֶvOq<{*:/>; O U?Hֺ'm#IW-v;3 c2myZlSܶY"Y"EIe*G\بRIQܬ)>LpҌG&TZԄJKxq>F*S M*f[Ho(=X*X.kҺ MiM˴j|U{ kyA*}EXs#W#1$Wu5^C.WSQׄ51G*lde F`fkv5 Ϩt{kv^fĔ= ^O)T_~tzzߍtߋUޢΐwy)mv6唞y{}7y~lg(:Yd2sV_&~xot;{3(w>y.%t1*0h$8_D%xKD*Ә?Y&>Bʥ$?^ū%ʗ_FN&OWx)\"?KQ%jsցT%Np2 PSԠ02˻.^Z!jՄ"WEiiaֻzU|ClQ*q}LJUjv-g(.}PN5E}{'|" I$oVib]u>)åU+(CDwQ3{ЕLZ~atLӟ4H, lAG8W3hoYAuޡ6(CωB֢8o?.f c &JJx^g}.M $2 !)BgNi`W1`ZLՑ1&0-qo狿jL9nGŶ)WLWJ甚+PeII=,Fث|- Kfͬ"ͣ8E"aͩ؍;[刹ٛSh:ե_~\>rf f8sp"*oGlG@$RK[ѵ1PnpB̽` T`Z}bO+{ʻAGu+'fB&b†lU+uAϠZ0Q{cHfԘk*)yT4郐8 |R ǓAG04 IpY#P#Pvd:0mt^B.!\1h jb B)*퀪-]&l3.oqY@[T$\Opwŗ o֛>s")8SP b#M {;$Kꠜ13m*QT9E [ƄNulV"A8> ZbxYkp"q8pÅ^1El f "* ,IqC ǁĈWf 〒 #j2wL6'D׊ 'slłOgpfB^'4:oRh$61}EP|)/mfù>`e4/]yON)`#oHLlQBY`IYoUj VdsI,hၥHS*9d>2bD)2\Au"rEŨ^'SPȻq-5 O'jU2Щ`/UA]4Qګ24+'⌢/kpj8zfUH̉fP-Jp%Y'7(E(' 0 m6쀡* j5$e}O2R 6QJ9୉$dg-CDJF~XQ䐜A^u%av4CWHkB][\WyFnŖ5Qh6|3[ق;C)}JY?&f)]8fp] ^$.b?FI}|]D6i{@v71sƥL@T NSxH(#R9?7b7ڻ̿%Mf @=7W4eex-iF mx- `|ƚ9Qa'64cr隱Vky8kȆn0#*VǬ?/mR>O9ݵ/8l{=/;_ZtF(tBxg.s~o ϫe)KA5Ta'G`2='ĩZ;r*LJ ? \W[0C.oi~3Y}G\rRpݛ(AQuxR0.  R.%64d1ƨH#6rWx&"1xtHB!lÌCf%2ke"y ":PQ 1(e-cƜcCH$)@y2B\UڄdFsΣҀSG F%qoE:x1Yǚa|(u2[K8ZwFiZ+m+]zYs͎şi^C\LŻ%ӯ/3`<&Uk{K u&͍8rJ0EB$\( M'*u bw/aŘg@-ؚxJ%^D휦d͜1J(22O/2)Hd%) 9v7k B; a+_8[Ϛp L(VY+KBtIJJ@3BRy&Ԇ=Τs>9''g4 B+ciw6/*PsZQӚ-h*fkN; -Tiȳ#*݄ !$WbdJ4;MxDtR&HwR]7CmZXV) ƠX 4\8$T!ZJ欳EDj `A{^kMmvS10a%XECw hW /k ā,c /\F c4)$d) ـJ(uTQ=HSe !I'H$iS"NUwC;g1ŏT J8eZ ]R5O<5+pc! 
e"(#,ۨJ7xa>h[Ne(-.uo*43ꪐ M1 a{kV4՗yʛ9{[ŀgSIC559U*%4ic͜ oǙ(E1Ua9vie4,4xL05o/p.K_hFG;>|> ^s'\=yPj 2 +F8ԗr9Wfaon}JKb":$kb.gh!L%TFfao7-dpJJZ;M*f+ApKf16ng? {_G!]F`יnG 38a >_v WIt`VKf+\#_!9~ũ4|i0]rTlWEnR/z Jd h|VgvH@Kp ~o3_GC\mFۯsZoLqY,C=7fumtڍPZ ]La"0PA87k{(F?sCMӅ^R@hP˟rF^swa2:9Ⱥ)yi'۵mo<*Kgڝxi 7iB}X Ev.Y+ eh*vn0lG!\${w$'ӹ 59Ʌ*ExEDlN̺2o誘Fmk6CD 593UU;l{(nuSD\S/@7xb$%+ Q#7PV'(DUHӜ/|i8/u=+L ?li[eڈXyv"TNߕ >x&Gf*h gxlz)P M3Gf0o纘֛Z'Ξ7vg%"A q%bΞƦgJ Zv@W fw]>ELc6lU+woi?">-6CGT_݀~'\5˾{; K=jǽr|y߄'${QO}_ʁO; c-@4\<+arʙ a3G5#2SM ;  M:O'T5@aԐ-]coaۤv?:_cJajn9NSrAh‚%W #rfz3b[Y`*;ϲAlI9`<ܖa0NrsP+)ҷnQі8t^OMrMnŐպ7Tnv.A#O7#tR]-iGh,G[IRv򈍩cn¹h QJQʙ2=i\1ͳ>mxK^E$K +,}=JzNsrXa(ˏN+;R5=j%^Xh|ژC&N)ax@D+#s|+> .('JQEqjyj,{ιV?})W}iKc?UO ~#vػW8I+!PƱU~]L%_m΀{ȲDsWV2WV8~ G""i\)%T9E,r'_eG*"֦Y[t9%4XTCqr O!WXs;5\v 9|th#i~Wv|<_795O)2:HP&QJ2II@^亪Um$nͪ }V_FNJ 9!'?*^w^A O"J$ C>5JkB9!6 Rpbhoct6^+ [:#\{#Yi%%Gf{JP+|>Dc rԞ{!C\QY0kyY(F{@Llڭ,wNpqdA΅8", &,aZ@#_ Gu ^ m{ ]0; PJGۘV%_z#jIk=\/M*Sﯷeի.8gMҫFeG!Z w3Fq 8qbg/Y/~<,<&Z[ +Eǵ4K/3G7Y#)p0ez1޿5N'S\+kE'ꊭ'+Bt'bmގb} twB~W(vnG< JQ"\aLqeA qRu5 `vsW BzJ!JTecӝitj"ftP׽N:˫|G Wr|yѻt l-c:Ygէѣ \cKE1EHitc_1B-_^^Gzם8{VLZ"PuZ7UT g?Ε)wX8*b[{ɐ] !n _h €$߀<$)Y h]yỪx`WO2Fk进W:Ļz=PĚCOF7czh Uk\Xڛu!mT!ěq> 9뛫Gݻf*8}Gsu)~r6 E6 ;A%>C{I1A tHOq1/n@=|-w.ZlCGA.{pI Nt<_q_]Z^u\J?Tm'=vf(o\VO߾zm&15МH8g6XAPO<%I9v ,'G]lN:c _*c褊snXݰ Zp'OQDWX{JP L|>o@JK%kfu6}0g=4)CkrIDKtLrz,>׻GFʎzӈc Lћ>Ao=HW1c-\mսdqI\3x3JݞF)gHIAir,M 6P[Fo`[Bt&6I @Vc"黯~{?7m+Rߧ*-`z?ҍXGGG]Pɞ|2_,d3-V?.M8^cZ-(NxS \H`J+Q`X~|>tvX˹ZZ&7gD-g4emF|9id$G }%, (,#Dz^ڙzyY*>Keg^]6e(-erBE2(ʹԥqY.I`4:MB`S[wN]|(P˥iZ9Ve4,l2;K F8)֥Pݽyz&?S ozEi:4%o}3CIש _[}q#ɿ"g~7,lYd0npHٴ%${f27II^VS$%9 (]U]]ZfMgp' ëWg+N: dQhE2eABMD 񯜃qi (ߟBK ɚ30U4oM P2[,]Jқ՚+P5XmT)M1"0EMmhl1%}>vϳzV|:H`nL'QId $f<^4 d73q3#^bs'I@ OYj9»`B;4m#KIr6^0S 3PSw pujm뱢JK~Gk VJ/'t h" ^\ҿ/0 %7t*MM49mUc2=kvFH~ RF]eu1)E_zoޓćBܱ&Dl F4 $R'9s(Q C&ڦ,QfȚ5s쭻Am<ڼL_r%5w2q]a%hAm qLU2a]s|HIiTY2I Ɣ[Ddzxx9DhD NF&I!R-c<"9t\cGrY6=()XO)=Bj"ӱ>-0YD~ wrWÌVgɡT5^8S5,D˶T#hRl> #N9x&LnHMI>nA}ݿAS5.xst|WZvZ=:̗x%iSgNV6a J c5 
cK!߷sKϟ~Kƺh}`z!0#\"V<匬|YD1L4&dH$I8jSfiOSyb\ȪOdƤDiƱpHXUfYoxiR7g0Q cQwIp{zś*Bh*usTQJݏsQ"JIm!e  1\1P5ٌ C(B3ʟJ8Iy*߬P>:.EFx0^PFW,cu@,b, 3!UOmLv|fUãI%W%`c9du1SUd .)+z plIBjpVZSAReː\{@ HiWo`^n9"j /AQuNcrSE Fͧ.?kR>-a3&)L %T" 3$ XwbJ cg16HN`,3&D,\)vB1@lNZA#~A.!;~7$|:+BT@wU /u+^[㛼?m/G:wCT B1lꨲ1˛:cOhyjc9kS$"L9E$uz*%lj4cIY޿ Ms<.#[֔nPF(BcAlmƢ( ^5TĊ,DbplmjB(%lђ*cϒ?+7%_e7*C\z`] P*37?qpBzp}3&F;^g7vΠβ6u" ,b\tUB {af;Pc-sDF\4 w^`n7= ?LWdN<2]pwY+q /dJQJuFSP* h5qSVk=!Q|?gSwY)ԓvJHbEZD1}Vԕ| vWZ H^$UBʅf/%$R q^xf3*9E>`h8n;PY?8DT_D[ԚWO YVfL{Y"H&$Sc=%LY[Iy}lGs;!DN/B̴'cgދ(l%"%H!ՑcD ռgڞ 7.y27.yl(k8͘ҊaLKz N%Ƨ"g"pf+IhJzF J &\w)^t½/V; &SS, Tb8@x^Ps:rsv Ƒ HSNΤUaSa=L!TiyTP+d} m@OS(Y&*E$q4Fd3 [ļ:שIIANcZbĢKr.ur NDYߖ3'bxbڅ֒'*@ҍZPAB3R)cJ-,Xn@w[ dPFe$%Ta1O˄])8YȖ'µX7Zisoee/5*<$S$),ݨDOih/RF_&(ڋY8Ok{c l:X8KCV gPU",N J q5;铳?G 0<RC~}bʑG?Ot|Vό9]* aC`Ǝ9p8jmd%VfRa ꪱk Pы-JI۫״MOKĶq[J}@5֡׽bOdN QSg>4)OL{㘷NKޞWc޾=y]^ MT)O =  y["4Ԫ]L?r/+y3G>ﳋI/5y'kL\r4 N.DJ"V?BkHu zd$Q.ǘKCYn1F BfJȳpo|^a:SG.w b:.,9GOVV8vye@ !ZOPA9qL;C\HCM-%qe)ˌD( #-Q"V}M܊*/g96k={yZ`AWiŅHv\oY5%Nr/)/bO%+/A1ϖmQ(T =J:W`ǀz rxk wyI$I7ߘ8PAQYh'o߂0YjxBX1ZF ‘ nw;owWaۖ՟w|Omwpg ) .[Ѷ|Ѷ?6b򩿿>*I) V"-oL;L+mXoeuAlXB6״Đ Vϐz830lÙUUWWWWFn6Aק,P~}Z._ޤ2$,_6<<;O(Gvw,˦֎kIÜ׵w|wRm׺qQLhc1jז pύn^i|0铥-`xM^8׳y^LolI.A;=^Zji-7 7:a*l L~F)]][Q;@Fߌ?ױmC2V59Z8Y9݁ӎ^DIH@?}.UM+Iv0ÅC>{sD1霩9?15Le]M,oy!4yc>˳Ɩg g:pRvm={Ҭ!6K7rt0^_W&d=7+Sy9,%{+$.mqW k6lbvD&3 )7 gA~XiU-<ַH,bi?vn{N=6Dp!.h-٬hCIU`y?N;A/邽%Jk ^̩9{M9=s+ȣ0yZoL1zj@~\t' 3X5K7qzB=6AtOCѼ/`qp(+q8{6akrIZmn@:z] AHRD(B)~AgʮȼҦ9FEH }u09-bs¹s$1JAb LR9<kт8y47aBZ=ɾ by,Lc,H E@1֧κQ*QN*MS`OFkjC) bp쮦QHvdO P%{048IʽA;P0t^%i a :FEuJ&j|\cߢaܻ1&$,nb<V¤_nT;]HI3aq,ȡ5kBХœ?=>DT=Z5!.Hw`tOKf+4,8-[uU*$6 frLFŋtFbTa=߸ ˽Z,:QcѝϊMOH!,7o@|1OWv:a*! r MigPqALIP=qɾZ0*N#gP^y|4 c.T="RbX'#p̾--0Y90*(VOM}<9׃Rq1vUVI$T=]бqUNLl[gQZkOv0y̠SO\fY~|6矞-somL[ZWkQf|]qaxJS yx7viP)iXLWݍ/߾ad?z}v݆}p>H<]dZ7h>i<*vxfFow|A{{77W?1r1n pBէus9G?.pxqg?a\ MW0J7'1`F?Vm4GFߍߌ) 2Kff4Ƨ2իRA7 Vq~yqf43)@|4oGu oY3\+~9l̒Gוf'~u,7wa:46hB9L . qExq4&˻[LIݕx}6$ք|r{36 T]ZBQ}_V V:`O7k&&gUK(Wz4y>Xs3#:P>L 56PbYBc#Ina#c . 
K9,XCkd$T}pڰ@݁{l_V0Yg>/3Ps(~DCgQ=XK) jGf^gQ# 0?p8K&,6]xރwet:|2Jt'sOvr`vz g NG=C_(`ӷO/N Ont|Ei.^Rn'RHEao6L&{#e&ݽ<3`md} T$VL daQsƲɲtk}7y6X+oJ!}@ůX&pM1,o'?, Ж'Nl? tR.0%b&V‚ʄvh;Q]/&^Yn pPgwpMAJ0\W  1ÅQ!!"xN%'dMQɯjirBCiz3o6a}ɫRcObvI^jr&mZ>d6Zo`mU{+qyvǞk1q`Ji8.x H6X!JOjfG S0#0AJ.-R &&5>f%\8X4Ly#; @ 0AIDኡ ȥ qy5Z#TD0?sV01K1 P0 qr ``AcZm Ytp\0lpa)2B71e6ṋ&0ZR}h-#eV=pLISca)02w]*sWU\=15LHDi%2 $׎"MX3FLhEX*%W`C 6}-(% &.C`$}wRUJ󻪧-KV%cЊ1PPT& qE'( %`ã $݃ZJzHGfj(B6W,ʘg*yЧkb:; W/K DǟNw'O(!p'ۢiL)Y00;(TcG1 뉆16TJx]*cj ƫ6X(##'t Gm)P?Ӎ~]-ŻvF[sy(݈Q`0}:`e #S ڂVwǘ38@R>@,JT83N}D#1Hm:>*/C9a݃mFǬظU,7'?NJ8:"^z'׎&%Bb'fehb͞> XcC`:b{/]9#$^* DK8N C#P26:%A`*)<~|mDmWF K;ބPn?^-.e"(A&)H5K~s/Ecq08 zEHΨJ%=vZJ 4lfbKmMx'^F/Gn!e`XEfE)"jI!jL6ֺ@eЈ``zJq0B(u ЄB;(ƼeԚN8Ӿ*)L\[0 }aCxUΖ-pW/^}uR9yU3黟S)ADY0ɓ_~~3O3wJ.OSZIQof2afrs3Op}U [I@Ɯje{&tw"!֫Xu"ai|-x2; M%쎏D0l!aRh{u1&GO&]'ͺR"SW! (K.#TMu厏)0ISfet"b1ƴ!˭Q{Q3u SެuFyc D K]\5 SnHW?!W?Xݖryy/&N΀,)-TQ QM%l`g$wdkQk Z ሄ`oG {8"z zJb'[{jPb>-Ǣ6W)U^3fiY4NI&7MQvShmȃkNZ|4mNb9L~;G? ??TdZ|>Cps!0isb΋U+HsI\@_G6$:a:Cޚ<hѼG(TQY/u륲=Y&䓀Dw`M {AJqR&wK`z$Z <"걗XyEN c8uDV(^iKnFEJ3(k\KE, iw1`GA5sb[nNygz:_!2E[ O1 $3yAЫ-%&ipƆsb.(j >wM9W\RĞk.dW"@^JȗP#*R v8_4 # F|USA9 ~v35%v„Ԇ1ֿZһcZR#P]AH?hh+OBrG)|Bo w.j*盰 ;@he^ ho[B`'=\\.Uc-$OL9 2cShc" $hNqJnOӟQ0;@$Trd;Q͚c\6.r-U?StP+wnvGtzP⇇8_/T_+t ~ؖ j:_>xp9ydo,Y^=.<ӔyM[}QxWs{ĠWA&V,iڑUX.ߛetY0W,atĘ^el)"߬ JW"x ^z9ӊJ$t%X#̓|؊F5Q= /zq}>2~ipi)Mn SWQ/nB3C̺ė` @s6b1pMe/ed2psVZq%:(⏒,)-om˹q Sջ5If|V~sqѹh`|IH%M U/dH1+:׊yhi`3* wRV]a{/hhT.xK,rH׉P.i".SmfrYhx%-:5u/rpYu>1Nԫ+7\ί)"DZ{!҄4TpeCQkd=T_%\Z0lV0%%_Z6+Nߠ.\Vp˾8e?UlcTi-R {qZB02Qy۷}zIC)"hw늓DWv\\]庢H+y)g̨m?(Cu3ۏ?]).woyՁ{Ymu^4x߁ ;q{3 '[}<:V= L-De'ϧǭ%U^h .)*95&5_R#Kjڠxw8j׈amZfeζKgWGiKیV1pst9ztE{GJUU:tBU:ys9yAk j}q3Oc^?Q'k^OY +.3yG]x hf^[w)f.ot{;Pz`@_)qEGfOԪ~G;XG-VYOOlZPd포SP z3Ad$W1jgtII ~SB]:n }3͍1q!j^ m=QKCj_ya)klrj+# (ڻh{k/~J !,~׿5]pNfqъp:qqn(4.p ;:ggFVzH́ٷ|kRFvߺS*NBLJLJMUD&ﰿ`Β<Jtb\4nZZ̍>/c^hխg6<=%H!U/V id< J}r p0@Ow3.- g.OUN@gEnAÙ0)ذvmF@ ОLͳ8L^$oT1v!B(Bwӧ) rIs?<Ê2w70Iɭ˫* z.׺.|o?booU6}dt:V-M:z8~eZQ^%tgp-}޳sօ.͔/dMEG9ךޮ 7[wK1$"r֠ 
UPRR,u7U=qV.Z0Ku8Ƭ.vw01_dRagm: S:z;;og9FIX56)&3 2*yk:)FdJg}(q*!/H}\sF?~g K9)͚-@a(9)ĥ|]Hđ]hAzYHrCc.A,EUT5!gYN7tcz%| ] ̬VhwCy*u&E;n7,/̩3nnaѶ%eNqsO'?^=\n>A'g}~W8$a؁xM<˷>A$}7~h$ @F P@)'¼}~bN*)Kd2psVZqYtԒj]1\f_6!ռZXk⋞ϵK/nvTn\eg6'NkGE>˯kr?emA]'ZN^-~*9Npݶ^T΁qZJ:m+ClP)kvvP/gw?_?U\\X$W*r40*JYJbR̕;oRDA eXhpYoub~WAϥs {TV[9.HKНJ{nV3ǣ4GY1*ܭP0)p`K5 <t3Pmz)(-=Z e2f5Awj HL9ӺE* l5aaB vĨ3bTXU) 8u[DiPG' %0fdfí$ ȋ]Y՞ߦ{aa3h-^AG9`S\(=5r2 =xTJ[}Yj}QէW?O}p<1|xx{R_SG`5QHf/5(U)!8x@k{vI WSEa'`S!t(ZAO&}HlWI@T%YDUee Uu3'J+t,`AY0ZLEN@.%% ,pTr$ &s#\B2T:,qR$\ œC#Iau%uK w6 rM  bLgVYR>H9GY\GH2m@z;i ΰV.A69"Pi|t!t\jT)N|~DCXaPCtХH&[2r r"\mNiԊ&xH,f |dbknݜ^Z9 %t]횫]60KN)>W)B({o`\C\4JTYSC ;mԛgH퀀ʃf{&9ZV'ͧHOM(ZL TBasw\j뢓u!fHRa{h"`#7}uJH"K!-YYBRZGb6 sHn闎4*P)(t!@"r sIqBxܺp0ҔqM)|݂[%hDű9@(F:'/4u7կb tD2ihD:eE]Z$kT@ b*wuFIV+İ{Ӧr*{Q@S4u01bL1p SR@% e,6Ȥus,ؤAihO#>L @&}12O3(5R!33]tؾG]% =HR|3'YNگAB%M*-U*R^sF)$C RTRMp8T=koƖEЗn/y? Cm7A~ia1-U3$%S9CJq-sޏ9G1p(iYEMb.avyDܕNu~Z-cCț#^%j(/v+^ ϯs\b] rl$/_k).Cߦ771ADDFnW?Fo\ޭ*R/._)BO;nSe]3 KVB($on.~O͵h *~> s J]H>'_k`FisoWumN`|ZssJ&"{pH\6HQLliF6@w6&o#+ۤ`fk} B xhBk|0<D#eϐּFmӜXhv bMsh;.U?]ֲbZrMGT2rcX1ZWP2#"#0>g:'[ւW\qtM G]ݚOS PUmGOގ 4t~q $jAh>%9@2QByLNXm>LS (2[))64$BIj$&Np̸G(SyrK^N{F^S(ſvC'T\<r_{w+,+]L| n/xݺQQ];[Zb|!^ͦMipCI`} xƦc3(J2&`x>O{c~.;֗x\]q&5Ht>^ٹnTq/ꌻ;E5ӊSHI`< gC+v0"mwHR Z+HXF6ky:9"M>#k)ELE("$3i\FaMULMA"{>r;$}Qb:dm| wP$ (MgPܹ{h 4O?< " =>1)PQM/4j((Oy99 SXpfOi"@է/KXle3dGaBpUwZ$LF*闽t﷑N76;!yeP$MŠQiٜBu:HZ Jo,3q% ^Ӏ1@Ho7ס;%[e,8^9g7}F<4UT xƣB xTs;U$}A*becŝ* Zw=(&Q>磊@FG JYEۦ9PgBP?D1`=fZcтhKγaʀệd9pنfK7qmk9-b>c/N&d^s[$2o)M!D86g.jTwSNvW\x&=er_Jd]P{gҥYu{SƮtd/tQNnWzfDo`y7; ԨU ?Lt*{ "A(?`UzF9zp6/#VejRqkzhV`- DOMsb9হ|3 ݴw7 q('o} .Mfk?xNޟ)%$0ܐ8E*t: 7$$N)_iyDYSJ9/O?I=hOpKtr 2{pR8 >`PvYO?ÑhK$NH ULv|@p;nfFKv&8'WуSy'W %_o~c8R!@&{Eow4_(Q ސ(WHHQbhmP pmo cd;x/ m'sǮB{>8!+1NEDĶ E2M$b$dA$)znuY2P;; V|GQY6&W }ټ9Bp;rEN+o!؈K=VPrym RfioNi-+U9U+rwՠ]]ZIP͛;(_oFQHAsb\Ҫa.F;_TT] YV§o1>߽~/JgofdϹ<#Pw񼀦ҡQ!/ due Bn7A2"F=%+Y+άj,ExvCv6zW-Wroղ=NŁ N>*'J=N3urY;O)I)?M-֯Mq}տNpRtaP*DŽEkLEd 
_TΓ<:\Y˽ՋڮDWgm<f+:~UmP(l(;ZZsucnrcowww7vu4Iժϟ$auЦ^mK>ܘODM7,h1; $AIҲYd KV"Mm(ŔR!4*i Ō, ى7$; GLw`0|nb~P)Iz9eb3*vˁgRǒ5EP= z{8FBW݃9ƘRA32ɗ%368Wp6Q]q&X*6.k.t&(PV^r»OVRV,INTIXP&$qBRIҼI%:;75nL3 c+onܑz!t58[7{Qt4>>$2EE&oɳ!&FTA0;6A3)r|{{P9xVx4Z={B뀢%dM?|7b({\>; ^rWĶb4_.-Q6m]>r/glU>7UJVkx!U95Ʈ @* 0OJ`U}Wu/?96( n6M(_>Jf\; WaÕuC5A#!QxFH '&J3drAT9ϲ~м)s>׋r[5F/:U b;]f/X |J4>fs=V`6tGzSn.<4Hu. 98L6HJ$7Y*)C(1` JDx LbL3Jd8CTFPp =;+LP| #S lëF/I oShb0EFqtU0e4+AxCSLI/Bb9c&jQ/ĺ-Jl)WIQ ѣ>baʂj Z/ڀbK@}U|DDhj^xz|]+|}?}18*~F?OUZ+z{ f9P]0~D O92y:d(Ӈ`Wj0a0j=0^, J*4iS,6̩ K/ x`!.Xα<9'wtQKMa."zB |8KW[m8çXaդz`땸mS@, zRcE.@0:G'LZU@!J6m8]%;n΍DYlnǿ/Qm BVk;<wfh[/p_]iV?׫ 9A>WRkV׶e`6gUm(z ޚtVf9 (g/< zRݯ ]@)РQ5$~y1e D, iPetg@<6Xm IcDZ֫cu1 GEбtar!$5Y x AXsĀ,NA#^N0G؄#kGgSiӴ8DmDMU(Z_'tu9@C̻w0[3.¿S\w2:m '*xbog&evO!ıZw?nH_1[ߏȦEqK4"Drvwmz~e[^S"CqmIof8$Z 01rm5u?S9XxL!eаĦ KE@ &l a8dsj^1dMP1>0Y| N/&NV$%Xd׷G{\g\g+w̍%f/叫C@t(@3,)Գ\e ' !8nvV*/ޝ^gsJ'F3%0H$}Z!<e9wA$$+S=Q)[Ѯ֟OuTV ypQwۧ2ۛi|)Rҕ_،m)h -ܼ}d! 7]#tf-%V?GR5pn''5PqoVK3/|h-dkz_b2aГۻW_zPw+dZ cF+59\ӝX=p%*n͆ޅI͈T[50i%L!B`Y%RK":A$M(քZ3>=ӡJ~JFNPX|^,A T5h d{.])n-lcxCf-Ý,}'sdȘFt׺bIשv,/' <;Lp ŧ‘j%!cmq pg_ggVE!̻y'܌ xPGW_Ϊ)doup/5Ki|#FK:1p؛  *1VYaUfIr2 ug:fЎׂ\bkr匞[ ]g8Ν,iDɝy9I Í.ҵfF)=7ӄG (=~RKb~ dC*d)9Ss;*_^.@+i @GCr}Vs\ FCi93|EV 94~S_5V)wk\]/TЭG?^5Rx=lU8v?՝v^2s^tɗ̃W/)*!tbA/s:XǞ:n)j!KM`:_ag:%LV̭xE`B2,KP]`"I\Dd >n 2 u"Jm݂ n1$o.A2%=&Cn<1X>w@ڶnZ7=ž2'n2>^;k~x ǎRr|||yse"ON&mq^="@Yoid[m(x"Pҟo)`f{sKcG޾lZ,UbO$T@R*(ArGA1 Z:}9WrsI)CHPp=ѳ E>LW@ )q`q۳>u%, F[b0{Օ 2"@v$FNBjM~6\FSR!Qy3D>۞N덉Ss^y;S@%Z귶}VFl^ v2ޫMmNMivh 9S9P=Zܟ^ճAΦ_<-h@V/NbTylLYR~2U֘KXo(P9$zIm5w(;౹/T{1,PnZ1~kCf-´3if?GiGABICi"k1jvmp"8å<#_'5P}ulx{4i^̨_LF<<w2z{>?!_urO$A [yb 1*BNju־I: cw0~ dh W*vjYE6pC[[>"jzLq:lg0+ X XU?ݹspnQJQG*X v^* ݩ`So,ܐVO{D3x̵n%{?=f:?nyOX"O9͒/5[$__M1wuQ&6#ѕ^|X˷6DS5YLDN/d .V H^ /=XYx1|w,{*21X=(`2,'ImX.5)@D %$2,T[,J7*Yw~: ̚\cosTi()")&U` Ø3' 0cqR(jB$8ˀf(7 U_̚q7},9"\-Fv>SkFz#WtWM \;[~|EVwڹD %C0>\榐,H yFzfOΛY,cnfQgV8TͭM`:\ wiߓaAmRRHO!Jjߚe "3E4HxCW%8O9 :ѱ1BRF|7? 
0DKg'xhD"@z ' c!˛8r)J :}0{gG6OWף$众Y엝߶rC`jg>ޛ߬zwnTm$\>R xJNl7sLI6[hrF1 \ ZYbKq*;_i>?Ź?~WI؋޵6mcٿ;j{wSI$qIN $nUԒ}/HDZ@Qjݒ(?if@%W0tY yHgLꡀ.+TM]~hltIw=gITJR,>^NpђT&Pr"r{H‹>,'`# MA5(W0TﲞrEs*qjDK{DBͨq.  # en!Z5v: Na0#}7M/lHv.d|_>Ii?z_!֑c1Hq#M OI%haw7o^LՇ[N%Fc1HK"!J10@.BWН45:o 0^ӨƱR2]0VDׇJ[};ΞTC-D-P8D o8%1x0feB%IJq=_Au$aJFW E)Kd:QV{KA\H[I`e{ePi3L8vj$#g5ʁ )W|唃?dǠ#T"Ĝ?Y.U`V$U\1Jc+#811N!,)cț5)j b@z쌀X AuZ`'jl0Ã׎Ǿޠv.v|htAC"qV@qH܉$ I$%$qNZT4@.&(U`?شmv'5!XA4 yv>#M+Z0A NGhve!g_9$JBۀ R$SBs*ɴ|9պ4V:Sm[a.!Q4N;ڵa0D~ݍq 56IXlxL+)Qb1 |Ż(]w 7n+d1)n9Q:E$FI(5۔q-v ̮6@5?/efo"qg Hiuʇ$1Clx2=/#]j-_hQgHsvP*ޜ7B JݨSr,7UFsΛ1E <vUSUUG-MXgC4eAH&uDSí[UORߨy] jf-gV ۬h®B,ěXգ)E"u5 4@H9]<@ODI45_\ۺiZvjچfC0|B/VޠP($h &ٲC~1\!>2saFLbN^ R{B/"Qpj$H%6g%K&pc㤃JTIiˬsLrG9RxQ,Y$/5ke &l.OFĚ $4}OYǨ;3|4=c˸ gWW-D22pܹ񇻀lC&peG0F7Mxhsd-CҮ&ɰEU#VQvlm2:c?Vr٫?Cj[me)iCgukWf>$lwxv?z?O;oOlOvIg>7ؕ\_,ĠA_%Oo]ӵ-xh|Ў?3|یKЦ|̷7+%lpE(??H"%vKTDD:@@Ќ'Zyjkȹ'P9!#~m=z=,+ӫvҙ->WgPhȻ I|ؚnzt\~")pRȶp:Yͩ&m,89גp hx~ +57$uY:; [ջix'<6'xi?f$V=ٔ {an&7 i3OFX }?yX>{IA%M9n) `%n L \oksݶF҆mR\үZAM>ܩ JR'O֌^o+nz;ʛ{?Ig,rMv;+MZl~[  Gf7p&s"3m~ ڬ~I^sJE EI;9<_x4sD)ki/e_'>ֿ@djrGUj8r2 ͑TV˩2 =fJSZ{aQhSZ1Zbq9')#9x'p4k\lzSEkw|P#|NK͔gU+)H!YٔRpnFF&;0ǵq&b,a12\?Gj/9ʅZH~K0s4nֿt.GSlFIG\"un^,->إ&8,q}hLpx#zJ|xV^lZCQRFCNE!TCUoY`!ʦp'3kME7}6BIq:AѴ?gTbxF2yml(^'i/ͧb_:lq?wOdo;޲vǓ1LEaCIl0L3!:f@.i5uu>M,zRT^@9Xn!6u0w9xP16nSȡjnnUXnA6Ux0r5ɺwLw A mcv:SS0ͻ ZֻUa!_#g?w0+dn:(e:mN6rnnXn!6%>9u92>A-Xвޭ M4Ȧ9g!kލPꠌ鄶w;-JnnUXn+6EKmspnlf  oNzx[!/s]v檰̚WM\=ksUhV96sL꙼/-PS3-lNe].2IQgڿll<emd=Gx:`"&AP]]-1eIZX}Q[$~Uc,*y+JVB [vO1ʍk}z,~[y.|}\;FiONehNdM&<4LE8bFH$ڤ%3*Nd}Wn}L4&A4Ϋ$ͮ^~b;.SqiEp.Drl?G"FQiYd";MсgR1-s@#ez%%|Ym}سL@ rF.\@"f)x^{%,HKC bL_/Q́.Y֏ z=\\!5[ח|棛8joz;Hغ8ڦj!_>U1^A 'ePg^к j%7wBSš{f[XIF^H { PnN|,p#!IU`J)Թt p*xhN pϽ22;7B2=7=pcu؞b "ŹbETlSbI<=E0N}sϱo06S4R t+JĐR!Jwi5+\]LyQO.uK7gĐzUEJQAJ(ُ[&`էdO?"d\1;X}@Q|SOyy4 8j2nc@.oDq 5ZK:L_9J֍$%ڄX eRXSxd8Ke)OH mIRɡL>-}}1wN6EsѾVRpk)a{ʵi Av`uG%YWûK;Wy)~@Tf~;GT`kD rS>nN 8q⸠Ŧx1mk^!ІJ6>] ׈vrv YZb!`BCpKfR i1x+ ;<])XiIqt[ui`}]ӵt܂Z`lEzk+PQ|ߧ:`T/Ÿ 
̭է>!̥62HTiOc0$1RQSƔ!P3"bȩDL<78|ف}3vf=)Lp ZSB/$!ҫ *29BN.MQC,h)>ҏZn]fwUr>b4Oc p'u6(KEf`Ԕqx]:gQg7xK^9fwkr6FMZP0CC)bv&WwAD/~sU C)=O3(NofS};x?rp33뵧GrzFݼ>AM22ps|VXv68㊡SP<(~wV'CF6'3J>[Z=򈽫C|sN@Fwܿe|)A93s"MH fcBW_*)޵AB!EpŀV?2=iOy`~4xM~-ǭe@ເ6r9*Xު%oVi ^}7E۽9ή&U#H!2BL@MsWRTB}.#0lU4Ud]kHG3"OV^2e `DC,s LRNAˋ>UC(*|*D%޹ekXR$6 ,Y/}3PG񍿣~zʘ]Ѡce~\r~yC}hMKjwfZ;DEA7K*.T4`e4r.Ě{j20nfoWf1Vɺ35aLf^QRn_e)E'ByvMz ݢr15Pe/ `30 r6C> es(BcI7CwD@=-qy8 Q&Iy_$*+A¯ x.x՛@A*Z-oe,l_ 2)_|q3x&eRg?D-ei%tF+LQ,T0+H3ؚǔӘ%mRI*93$v Q Tqc q: ㄧ8ESmD1F4aI[(55'c9_*2Ę=l]&Y-89Df2cKZ@wˡ^R;p+ h>.Q!)B;\UxR]~h$F˵(ƐfS NzPP u[5Wt(ŠWNqz_V/?HezCCWΚX1ܓ!*;W !& pZKc<"5Ko{5*z搬٢9eT5@Vi|bz߬rӅh2ISAqW/ghG^39"Y凜AY 0[ Hɰ '~:&Q_>/Cg=#+9*X)U<%)z>[R5ܕd|Osta$uҿ?TD'W-x R)tgDL.}xcrI=[$،lǶw'@>Ci9QDAta LFXtn~jXOȳ0p7C`v۪LJ+GE~LCL=I+:m(>sPD&LZΘx+j7B()D(>3ٹURQC; r?nG 9eٽ.f7^Ъ1SuAcꔷQʔ3HszԞ؇ӻ\|G9j}z>6),"OQAHvdm[YB/3EK?[i&<|\"ي\R(]80Xȳ{{"(ҨY+ OY]VW,h%Tb]=mE]LJvAPeS %U\!Et3oZ Հ\sɎ(Ƣc´6*kL9Ձ[tNҊ ĕUءS |@JYtOZ%Қ^J*,O)a 甛1mč$m^J#MUԚi OQF!BJufi-?k;m6Q"h8l6V9춍P hLZ9 ,?_ r纗|jA+q"vaMR/-ւ{@ {9rT'\n,l+IA6}v pژ̽t /4Ynu6#k_0u˽zKeE؍b,7Gδ-GY)O1бZ &l_/'^iIƯyVfTO":M`3菝qbkW\)*͠ 4JZ{Mxԟ{mc5RIf"_J4b+:!8Va|Io!DŽH#KN`(5A:ّKRR.9]Ӏb[n119ik 2dg^$HF#QxibX5PH(t$މ^;Fhxq:/ٰOKc9Z"`o`B^41:В(finB.Y3I 5C"|t>k+[ vBastԾ--[+}_n@*@H7́jiߥ:w4Nj ŨCsЮ5WqVvG -GfaW% v{q-eۘy$r/>Vml_=' -jjAbk`GN~6"9\ ? f݇٧6Hj%hO>r]y:̂ f}),dD޳;a e#BiZ (WW;&NLA- z̥d/mu-5VRPJRTWLKb_?wrz(}$pwX0~&ٞ քM]=t*ˆ`$Yჵ5%HpS+Ǫ:TSpq(Px"g,}-tiu Ȑ+{),}B|6GJtO3kl~rQ҂~O<(4DŽz`/}ECLQKu:`fP]h1'XLe}.5V0%Q%/C, =n)4`)Խ ӄ0zjh|$fw!2O-D>1WQY\1{[^H PVu1Tbl84;Rb>iV`Q=fE}7A0-8ɵ0!Ra;]g!yFѦ|U/ds 0&]E2(,D0`\NJh0.,9j fGcx8'KLgS!)ˍ1ae,\E4f@po9^qE6tX&"GGf9_ޖM% m8gAcPUXx.lTv0."PC=!RQ3̃{߂(aXqiHȍptaU*o5> }Z#nnI.Y56S' Z雙Do3 MX"Smt`-dQ"k#kxDsa!0HzĔVp LCa˧h Y 9D#KK'^yKTL>{l. ) G+›3#ȩוYtLGeU b]"bO ?Znj1妨ZOԥ1xo<ksV`E!2kq 8",rcyq}\4I'NkisUVaɄNi?N.*Ss 4L҅6ӍWpm;\5x~98dn ~9=PoYW.ԃ+m81M ?t*ZRVAAQ]WaVCDtî:芠!fZJQ2! 
2@QZN ^VRekD)+-W Pxa@汎%X I%SH)cĄI!$}Z!*Ԟr7Pp~ 9լP0{:Yy)Hܾ4f2u T0ոAț_N?MTyifQ3KWszW|2نcCztLPe?$Ý9 8(j.JGX8fxed|klස;ĩF!DBtm""Enlaq?kdeƮr1ӊ 013M:mİ_`Q ,69hd-2Ù%YSmsn^9d6k_%ĊGS:DG(qQ{;I[Oے`3(Pc k_ $+?.E( i!#buߢ~E? +XRWmޝHy9\t7?όqj#֗;hL^s`,Ld_6 <:qkus{5r1Ph]3)x@*:[4飇jjN _ˍU eg1~vG*N Lhb0y,50/ߦ/9F~0M{ ߶i2ke}ھ=7""}Dȋ}3'z-ÊMzW2Si߹īӧ(-2o֑hs" h >.Rט-g~C`W Ajl4ꖘco"]tH3%97ِE-⽑ `qC~>z ꍇ++}kž#ܹ9O)l{7_y ZXUjC_?4n]$D#[!O)ڜrحGVj0wHG?: K=Z"'*)"ԍzSo_R)zL];7Ʋ76F1㲵By﷦"iIXad'&ok4Dx6<(9]v7:ٲ~^"GMrIrTc"dhzICYK$s\uHbs$AcADB(|̽$'1Y8 tC0WJ#k?Ϳ %],\Zr]o_V"ք99.lJ&h 3yE u!ة$C,њH B=<B(d) =!v'vEu![m#|O/BJ}C"4u-3xՋ_0pΟoP_mA-Fe5KiЂ!-ۖهb:1 -kqSaS*Ew /vzksdR8;AXS [S9B<;X,,  Gː]Ժ {%G(Z8G=ɞa̼.Y%hfCҪW VbÿO~I )"MrAY5y=/2s:fY'4–z+>i1v)W{Qx(Rȗ:ʲE2?&~ s " aT`LGU棿S^qy<1@HǽU}N1EYŜ~Wc] C1buܠI>%)e'ozr3GZLO~tǬHZbՓܜ)^?OU#-lRēՎU Kzrbⅹs8MO9ZG;cA(c/ 8'RQyB4IHm\@ߘBˍcԐɾ~gyn'Xk 3z8I|>(ܰ \z#q<D[EDy"]>k~1~γwF"}?];%~9}2؀5\K$o?{ƍ K/#~Qb':{]IvaR*s[$*m )jx0p,vb@@{ryX*E +ؽΥ7S#Jtmr݂J#@lMYM"5z^-h$ꃚ\Ģ2gMү??nC?5&݇jޒZyroS_ݟ"K9Yo~vs?;=:C8ݜy~j.&3w3Y}z.T.݀gJuy(&9=ze[E)4.Xd bʢ`+漍cD۸d6_rA3GCfM*S3&H_QWӫjW3PC,R$XLjAU/~pu[{WwT[Ƣc\W'qsZ?T]4&]բj^iRh%)T4/u~t3DQ:zl+RayÓT I>PE:j:R .ѢNJ M0&mArNY5'n5HKwz%%ZT"_iY"MՎ6kisc̱jv{s isOh@wqE{~Y:׳^pORdD` 7w ՍW-gL6တ #8M1x$qY9* bz}l1{β )kRhSHBPPLe-#EHsB>?A. EHPȽ?NZjN;߂^?_IY'hR3z}A*#]L]XX[;HqYgj%yִ/p*&ӘyAp6+2v-bU\)5 4Ĭu:gw5?O&,)jq҆N&s d !5kjM}qf Hdz:r uuD#<0XVDՇ+0(."$1Rh`ש_~Y$㜀+0Mz3"?]u{UEJ zQ뼶FicyM]2Swaymsx$T`BE5vFLzTBKd6ITSx5ޞ޸yTt׳ć;??M&W3d0⁖ow>M7o'qRzN[|͢qsC^[)!_G -죿UmS61yUgDA|UgƉU3[uoc ?ށsh4_shdh'd,krhхB56C|8uXckDD 5=j'>vy?[4ha()eg|#!D"Mٿ_]>-$3MNt=ǑƞCD6lm{P VÊ6'mhs>)zT&Cy~s1s2'#Q<+Dxe\\gjXjĜ醘/3j|bbb#7J0A:>#pcmL8R1;|?邏1zOY_e"w1fC)LV.`w1f$w9[hd;3.{0FIܲQ2B+#vK!dEs׫[0$n+Qe f^՜5OZoCemu2,pN[ZworG-Lpjw搯4 ;٩/`njwqG?>ԥ`:VB 7S0/BuYr ַZdt.nS:˩[ow˲S۟RnƧԹJ`v=8FstlΟywŌ4qBFOm!43ʽw>5:zݜ8FlnHd.ן\~B1 lPy#Aqt9B gG?.ѓyt18 ahDoJX}mf^L8Ʉ3O gs>9UOfqEBdЀBh_߫0S8s1} D>i8Wd4@O(,7 %U RvA~s ۣpBpSi@^"yv!I:no w4w 5! 
Lk!ԈӼ"uQׇg)mYJ[x56:GXD ^y-X'6[RWdy-mki{[IRj~:+MIžucW4٣:IHyՑt%LST"g^F.s ӬfY˷fMbDY @<6<HR]9CՖ9=@% Ĺ%85DI#S蝃Lʴm%,{5Z -;aI; JQ?Dg)K"J dΘZk("# CT%f@' R=;iE*ad8y  YU` ϩ8`)3VEH&Z6G+u)ʀRRˆLlD|v-u$Jَkx0  F||G:'e69t1)$$Zjc5WC$1'K֜7Rsf_^M6YtWst:m:ˈ- 2NxiK,! KMXH'W1J!Y-YT>As 2.hb|r\*?@<Bրi+ |-,VcGƀKɈ|fYRľRw&vAu;ȼԌ)iѤ=.Q8Rg[SEiDDXD&HA=w^Y{j"K"* Y11'HTFU#!w)s᫭*-adHpCvu%`e;!+A+">%7\Lb e(Py N (IG 68rDsP*̒F>[r̽ gt*\jҎ(0pqjbJe"Ko %4i_ߜ]YrnYM뽭;zP'~VF?|){Ȼ5o|u[O/t "*Df//#ofmӫsXl"O3nՂ[s?s߷_ nS-?=KKYR_jȥU1ka#c .#,.$^##zThȧ;)t>wfL,,)Z~^0o=? }vni1EBiOҗ0/.l4S,yٳ3f<^vަi:s GZ¼YaD>`XBE. ъoa8KAS"j,*]X"b,nתq>h r |im|Z^Et#PoQB^8D0%Qn2ȁNim>حxB^8D0qnhV*蔶цH7.c[y-”7E{ f#g^(xk$'{nIZh=>I@Vd,a Y%{7ET(6ZE M$ڧ(kQD CSs&n2ȁNimv (h.u"LQݿӾHD3'xMxhhCT1ZJ- y 9*5)kO&1[ +}:7xxjKԒDGeYAQ rȥ6^`֘}ۿ pa[MvE(BdNmN{{;@ hǹnQ̖ :maC"h0v+h.uLalqV{-9{r/-I]PiTR{R3j5EgZ zw(i Ie"\tQClTY.сj?v7K[^~, hSCM54U(IP% Ug66 KO46Da ?}-Rr*DɁ>%Q ͩYo 坟sϓѿ>x ,z6iF }zB2'; ~w6 u~>1aJRuGAbyr~5=D4wqd*yKwH0' ƌ6Y]:qXe ١Q, B⾅?E6 %mǕjm4S`[ *J(5"ML3D9ɰ B$)N!b9ڐQMz]Q*80H^n׵ۚD[kr50GH#"O\M1ܖƂFܣx'+ڈ%^|͗^ qWi̿T B|3TxHu(.\Jx>@Rtf^ڪDhٸ"/Y4%KeGVW 񴏙PvtG*p!1!m* KoM~7 {󷠅{ %ޠ^a`>L (} y C2 10(bFӄ8-q<ZҔg5)NfN+G7`9xXISZp4r A}m&&a0Ķu>Ӻqo7ړN=<\y%"#y7&Mg7WABTͶ1[3ҋe.bFXݯ.XfK| ^p.>.1cFhx [21#4»61yx T eXlopHαrap,EDzhFRkD6U4$#oq03Lx䩄Rw݉jﺓJqXٌlRӐޟ?w\4UT 2lRL2,Yh I(-O=#3F9ap]AQz Qx(jnZ(輕C 5V1VZzצ{>~o&to?jᐲ'O]kb-4J^u#1G9妜aRM ? GZUR6RSKH/X DC(pRipHQ;$Y$FO;X7ĢP"j2[xxUV*uDgǣ:FڗtmjvUgv7F$tדkt7^FR#8DH0'gḋ-vc&{hL*BgE'G1{Qd9ĈÉQ,]NFڇ^~묆栞 bQM:,BN;'疳2.nf߫Kdy3/L+ =nO5"%p.V w!vדZ P]][ΜPpNث⏤Suޝ,RHBY$FkɄ)Vѽe[p^Ec<#?U(!Tρ\1"vF ^Z4ZX衺LxUiyX/XF8/l5YZpRq~˜dLL)3΄!)@tQ3J PpKFQ;XRx'9d!/2¥R8I s Ǽ)(7+MS E pj4!V5V@! 
!277ÐP4_FA }ߵ_?sfʥ?-pIS} =֧9 Hf>ѵ^߀7HM%xz|0j-1/{ xJ۽a:p3fZN q$=pI{5XN&0TkpZH(=X +aB~?J2E^d@=lm̖fxp)L`]CT>wg"tԶUɕ>4$`HP^d[%R^r]?%7D1 >1&;n k* fjOUy`xT$l/sHRt4Í?5rf-p$$@E* S$IYs6.-|cV 5t\my_Jp%``xZUZ7X* chO #.gxE#  2`4m apX&S`;3FS@HPByW>MҐP-FexX>B"Za2'eRˮdA`@+&k.2c K/S`" U-&\b,yg[?KH ܺl6.*4bePԒ8BTW6)+ȗ޿X%4̦EbK%onBo&k-\s*> <-Z+OH HV v}mJ,XGD$"?ޛyr}(Bqx@8fdv4[߃|z&9gwsk}f.MpRmKyRiCQYH={^=>q^? `걋$Lrx }4 mMK8ks>c05rh iLƧFdFy; ~B a\o~Hfd17͡=[ҟvrD(4-8]b1\T90҄1aELT 2ti[-XcI]<HЧ{\apE{OxWF seR("8pNbMHgt:Ϥx!jH>ߧD՜hݺ*|"6~7&Mg7WW[? [3ҋg"ЬAű`(Uj)6\1M \L`zhJTJǀ{L GLe!30ņ:b D1̇iF~#abI F%liXb퐨5I$qv\#-)`]]QGu)V{`NbTɸ{J!8 ŔDm c vߘs gx5鬗T^,i 5bTKZyPGɮG#1`]J]jKy8`2`))8LN9DNpԱ9XCJHTt3T-. Nk{e2] No4 /i̷ܓxMxPx.gda~a/'WތW_k+W1 B7@tv"Fпvg:aoQ~*vTX0OZ_0da.b]gK0д3mS<2e:I-"II($̼Z@1#H)jH~r @X#!#sHۿ\5d+G8NG}/1]Z`=Uw# N6n}V\ԏ^'ˬOw8iw-=cCwm~9IZEyٝMxdG Snl[}<̮U.djFOqr3t)cx{?~r`r b9EFOej/%!2 _0ߵgM!@St6* v2wK_+uV¥Oxhti䡃MGA #ݎgGP+{~NqQFיo֣j~ 6~Ut=OpND[yr9˳?bgź^.ӳN./.(%hskKtߝ>q\O3D*1OM }j2ZPf]E"BkumbNE#r.2+G.iJ ~Cn'kB9oQ.5gP)&JdW\5p}QMM2ZBIi%I4n08k7aX%!0r*7M:6w"O.]M-͊#;3N6:-VkXiZ+ܭ%:%Kd[G Fp0.U(3+7FL)e"HBI$#AP@합«PjNQ "4.E.2%T2ʆ bK\70QטVk) |m\#QPp`n0k];w7Z@n _?W.7' |8wλw_#p =X=f *ֳk59CO؜©|tx XTr|8ވaW}/`u@doxԚ JVBT9iYţ`N/ߞ)aUͣYv\EYEd2/YfQKAx!hw1&$ə2Ƀ)jcsIZI@sHҮS]d 6 db#cL:G &L)Ȭs,RހGZZʽ)C4W.@6ݗrх^/w |CjU࡞rwwN>{-Ih3NZgwZ Kws89֝8l?k &I6}ZϕFOkTBŀ|7)ZD)ʝ!0D16Z"uF.<@I*=/kP?Vbi:|K&itQY=Kgu`Qj!K'&du}KVo< rjq^sQ8_fەOTZVe?SvyP0^zbt jC @[c`9k m:ť'u )\N݅qc_S+wɭ$%E2U{ոA'ŋg(e]7C˷y<Ds?6ģ9mky> M!=-~EW'6j:i#!9Gi(%!$eIܩ|hA:&S,jks HC ʏYq.u ,UD`rʘ]TͲw, Qv<|em@Qz<~oЕD)6T S`ƙvc+nk%*Ze֦5!A*P+s4MՎwlnƷ0M~&4}vivKƒ!4eՇjF;C%X"UZCRhLZ=~X"u$1m{ bF٭L4i6oj4FvW^?ٟ:.1Y8sON:$g3U9K*Ln>>X8Ƅ6I?/]\e*Wu`1o+ɜ4:y=Be-*D"Β$@&N곿\-8Z0 I?$_PXx%XVXrTXnx %5UBgSk uI(Xm:SX'?7@^ A"G d 5hN(Jz\*80,uPMTuy/@RI"p"nTgeH%'&%Kֽ*gQCR اxb1\>CgVn4UN߮c3p.r"1YoK 3|Ω}ID+_r1owa"GS dg~g 0"|ٷOWٍLpp|@Vz}8>cA @6\A*!0Nܞq*0FS Ct{> Ȭ 7pt !᠘uEΗFn"|=26qP mc*m.|P(|f(xà@&1Ɛ c}"\#VW!f/!bc9˨d\/m)(x}ROBbf#2/V%tZW0J%-{m85( 0EWrdSrJAmL(y.-U Q[aՕS*7g)]2꼺?.!u@]@b $* 
V8JQ(d\pˤM1ADPɒ&S^)%l1'mgJmG.8S=O"˔dhMr\JTϞ9Td҉لkSz8o/9+ ~moq:NZ**: #LJ4B k4׵,4l]_3δzpYh4LwE:u`M>(g](@;mYtob[Lv@V&cX;a 7v|u5jXY1jbnU[UQrDƬlf #qe)ly,hZs5k Mb-VlGY(-籅9Js=x)hlaU<Ev1 \Z&qz׎kut3YkҺqec{*VOh2DQL q&͡|U+y3k2kgJ5a' umcHSok s,8Q]NݼLyKGZo?a_?ޟ~Ik[B6o r;V9㢄h%߭E+{|ac9X/_HW+z;NrM{|&v\ ]G q}ԎNxoi4|o Odf,~n8CŸ>i (4=LSc7F?iu0#U8UuC]6iCܓ٪xE(bѽ]ݛ){5tn* >M<`YV9U7=*6^pKOPrbKo%ڹV:"Z}qVJf )ͅb\>t|֟鞛ݵaOZIsbaεӛj{d'֟;8 8 $Q0j[FN;bOYIz K!' qdcHYQ/;ܵOdEu /' yaÉ\+?@C_gObMQ㶷4=F=dpj=zTkbDeDx C0&cc~c/2>s>_!*N+Zd0P*"ݤ:.jmE(u5VP<˛qz= $}O0aV(_׹ vw~/ߏAzM8u^ԏnދh.Jn L{*$'*-.mu~$ԡj\?s}TTc*.4Ë2} z% WU(%Kr^&:а}ikhrioIс w~}HNU$3B3Wa 5q!iZehK8sq>?yJULP}Mս5oo=Q`- I,ə]z?[${C 3(LXcO"2_mf77XUtQdDNF&;C |НL#*( EBmv':y zQv^U{}ykDvo/;b+; P**{YבCrOjTF^k4HiÒQʝ֔G)(":t_#[ Q.t&i:xr{r`k{ۂ.Iۂ!Jdk-c3uIϙAC-كegCz򯍠r{T-ic{eP|5bAEƎvLcޚ?y$-MǴ FtV ?S 6!8AG}S[AFNwM hd:E* 6v5M8GjMVn mvlNIsa*7'n&@zC0#[z6vzO~v)6L$\_|l+u@WM[}8k0!Zȩ%*YzVp.)slFl&K iz`E^=y|.5icΐ $=u>E-C[W[} }>ѽ0f=n^цڿh+mз}#vƪa+h[7}9^_"[v> `4ٲz .g 청:aikNfwixݎ'+kʯu,3yMLV@} O׀^T= lzͮ67jr<.ur-VlP&+eK=j@(Y"a )WMs\4~ 0)|^/'yfLK;}s3zxy=-O@rf<)b6_Ư%'᳙O~< k$:J7~,|˕~ƛ'ˉw/zӴVQQQUo~l`eeV[n1HZJVpdA"(Fſl>ήכ| i9~*/,Hћ,C dY~u7PZY,/|妰(&enxQ}ӊǎ|_OElpe拈|s ltzci2?s#͌r̬tc(C%&DJ5fSrRH 'ϋVZ[9$`r߬(Ç|ja3٨G?BmKBx`qLxV 5bcG}| ^팤@+_pM'㗩0]݁venSh?G߾"n9zgN$ _/=?nݨ\ C3if4褰;K`,R# YKRTJ}Q'0jx3Z =7SD"8@J(X> !69$N T8V1P4M6衹m"b`|p}\G!L%5)R.tC(tZoDN@5GP' ˘?E3eUjmO2*R V[Z-8qJS`!YzSPrJ{˽[ᲙU t"e-sɱ$XEW4F^!5CD+Zq̅{@ o= P}A1u4x)H<X(AC(tR6EbB1ToKX@!ڃQXTKL+1*cFjCl`@odc P>p6!,ǰb8`ٰ=2i=3`Nr* `Tdq*$'m8 c,F LqQh J? 
x/m+a6—DVJo{fjeBaW3 4e) FK A}J&K 98n"H W=An*b_`p%4*U/ܤᧆ( ?=tiU~hф2b9ۤW#%ONkbߜ}mvbk~X tX ~X`tX ^c?bA :t^ܳ0~XC|XȣŲWn//xŒŊ//6w?6^y^KpX{IIAKu0.bHi@ k@T% %b%b7^ƋGˋub~_^1;Z^>Z^{ʋ1bI$ t6iExJO$˽GCOQ?ؓZI޽bt62W _+*9&c?gc%n;iC2Ԍ 稓IR=GLo5GFJL5Rvc놁E e&3*fտ\X"ڕabO7^wgqasSN߳P*`4d&5|4=jOisQ["T';yޒQQtݒz4U4I-0Ի1q-:z][y[) >kz&4U4I;Z'ލ |-:z][Q uKkz&4U4E0!_3ԻÈbPkaLNܭ:X&nɇ6r4U4I8§M*~-:z][(]-XVMք&Ծf:A~(ѻbPGuRﱋw뱮d*I?xYӻ5!GI:¨9=׵7j 1s tn-^rZpF(^IK`ӡQ Drmel\S/5k J#rVj8Rdh[`Ԅ\ߓ{jⰽ$ׂ4j lX} piuZB!sC]ƹ9ǜsZBzx9f4sIKh~}cH1ssBrs9ܨ%HcZʜc9&-A-z1t1sdgY. 9| 9f)ɨ|B1EY!cVs17k ]Ȝc9F-AS4&Ts17i `zrx9f-9cn1 09cn@\Y 9cna1 $;|V1Ctjߧ(c s9ܤ%`c9cn1 ,:usǐc=VVe1A=cY,Y1s@Ⱦ7{1 ‰9cf Nʔ }=N +3_yݥ;ˋ鍙›S0bח _c|qP@cY9fV I%&DJ8ͷO rʼ4t>_{X͍㗩0]ozS?ϣags:i X\Lkci̗Oޙ A*` ~{g$ƴlD%FkO2keoK`A4PFmv3!7ӕ`c,#mI<f?&a' B@hd:aRE6- E;*؁@ ͂YBE"{,r`#O=*&Bp\e(%K\*+$J8{:`2l dܚRL_y r6h!dH B8%u‚[l㥶p3aacEVqN%B!㎓BGY\Lwx&X}+BZBzBB27rq*C$e0BfcL\J kS-5!'T -40<S)AS-7[X,0Rr.S &!*Ͼ,v%p/>\b9)9'C#T X$w4 B9~B3g7%fm{g=MDC8Q>,}9_΍?T``.ќꢼd6+۟9[M w*,V?4aTF5cD3FQ#ˍp}y |ǫ ެ~,;ߙmz:[Hv0ؕ]|_7oa|7,(AmCҩK$>|D$~m7j|+c%b2]9XC N?w4L>U~Mǿqu{bjIUCGA.XSUv^ C7 '&_ ro0gÁwoa9 4C2 H_j@ga~'*8MwlϨQ/W #{?",KvOFxg$ 4ĚOԩƙN0d|&KbojABZ̈Jqz’qAyo 9G?:3 +#vD,?9=yG<.W ?ffTh:!<$8'nsb22fRKtUb,.vnhj~Q"SU^] `IK7?F@47ֵ=E|5XsnT&?w~b|Ad 2t{zb(p~9Coޜ9- tj|^e/J"6I:sX `o<1R5B1f62ί]/k4_(Y| ~hP _/gOOQi2! 
)BAք 0[{sEd|>2J|AEJLywl[il+V7i[{)P[\)cۘ^~raqs6x,]O9'WE:'J!3o28`16qmi)=ʶh ̎zWeQ)6AxGl;jy$XK<: #K 8p%YA'n7u;PO39.x#fh1.,y:Z ;F` xNJr!I>|w!E+LBHT(E:D2T8ױ>9x,>ρ]ErZNuIej&WF= ZcB *9P e|H*K9 { @a/ $@fӃkCW-`9>Hv^s(L# fCH:,T3C0c-$5[gT [.ddf;%./^HА9f;1QN9U;\,O"h.@0?%2Z:(6|98&}zRrGC;GC;FׇIsd=5 %h =hbpr=(@gVnxi>sb܏7Z]1 -ߴ}7>|a86z[9<AAhm0j&XrPaǵ\0El6n.\Q'=['`L 77+;w 0,d?rfBb(ZwOpjkw,@+޻>Nѵi9P?|-S_lg}@ӡoH"m>*0nuF/] gd`6pm]߄T{`6sҌ2θV[ 0$+wyZgG[Rvv a|-J6f΃1+%JbԆ1^(g2=΄xGz;UӄF"Z)&=J{QߦKiv&\{vONrf`[}2X^qw&^7skd>3VE*[>J<1rE@ 8 ;rыaX 2 !l!iY'zo@/ADE:wg=tŸwyE,xz;w>7:{uzX\OAsVΈ-K~|giZf|SΚG^#7ΌC@fƒrpb<J #)0}"˷6t:mAMn_Ϳ`(}>ݭad}ys-wC0<'ǰ.``s x#X `Xu1> /eeFb !a ?q1HpO{4PX aL)ZZ0 Q#1)~u5ZGff'c@!raGXsxsB!KS{c58}˱65H`@Pc]Q1Ӝ|!‰M p[󇿉V ZH(ϑY bC}Ba%c}Ŝ1pB\ A`zkh>!ٺ*O)2<^M$<(C5o+A!4^l~7]2tGtvuxUIn#uZc\QXNNZj>^qij(䨝Ďq6'=qk`,^NN{f"6+^yE e)֘2XVY`,bѥa޽=t 8~< xG 9oQwIZQ%-d<mal)wԯ{/%d׶[S SByR 죅eR_m,k@֮͵vӋ~ i~ I(hLl Ax`x|Z|p Dgu0+*&%>7Ǯ"ZU{vu&G <\sTKxuEX)=ޑØM<$ nqJX{ATguE妆/~X9* $1i* 1c> xǐ$9!2&Mn;wD~YnjdE}G6;x%o_g?K['|0߂F4w/XwCbV7\5PN6#茐 lx&|\QI魚ӈiNc>yA:izImQu]%1M"%MoM S@%-TMTRΞ\:Q2R*khU$֢x=،Jv=YUPnS{ v O; _3gs1 3ɘя2ȜSZK6q?[)&q% ~%_Jg a(1@#Wҹ\Ckw])z8-^601˫Csz(m^aA頖Ӄq"7!p(vb? 
ӛN'x3NRi-N,2sI奢DSB!jW̽8~ _$b=䄥 C$ 8@K15x\ƙu+fZ0 cH9Eҩk#Ḿx6€  AgP0u Vil6hM"v:\Cca,OJp^͑_mi)#i~X+kQ VTY30 ;*iTBK 1b¦UL: _ Ƚf:XrfsF&|V5c.V)=~BT LSuaKlYDK<*Wfa*+t% R:Im>\l:A^VSTxh!@XKvBmcEW(JzxSbܧ'PݕΤTzGz^t,|p468e܍ ̿_{XpRm5 -o) &Mׯ&m*luꢅB.A206W蕈M*\egg?]w1 0W<_O`:|ƹ ~: kv[\{vu&'8|!ĦPȆFyM\2Kd\*k" d Usoz*.~1wzN#,(j7aMrצ&]LԘ7ߎA=LE{[v=XT~pN~MYfTLꈌ]~=<6@(NmY`J0S)D`F%eTv.L%dƸ; J@!!QݑOr;vJ.ٙ?XR7P(BvLHNW0IBkPVC-$'wCH~ *NcNe@nY;0å8RI On4fׁեIo.=H,}Ig0C.%PTS OfQO,R)I?JQ T>En,]"T<9' D n"o %S[>tnyJr.[:T=ηJEI3WWLh I@@3Lq#rRIYV6(I]l;+yɔ҄:<~"JQeFvm =q:º`(%O4"cNb%xhX(D@‡e\0 [xވJ=i]\:[HnR^#%8 Ex=)’%X a$EP?KȅuLFt[THA<* 6iȡ=p( Q>[R I9o~Hq-{>8ۓɍ)ќ 8}'Z75o3m#Ksm >Y<b <7vGXƉ<-Oo"k_ӿ<+<uiX-\YE$䕋h"P/ܣvAvۣCxxgڭDS[ELqqNm"b":uǨx[퉦j&$䕋LK{Do7w n&`"1iSwa(6B}1%t84$!U0$0bE/*zki zBO8 t/p {|GU&.`qoޞ XpVnާߝ=l0Ml-Ef ɸ4j:vqu޵q$;Tꀻ ˝gOkJTHʱ7~CR %9)WUib1OYN5Q:~oۇ;*-kp t!v]lZH `՟砾s]cj-oa`ّeV, U i&t  "I}Ҋ뒶r,L` %]w$8|Wx]ۯ+[;>-o.v~f6K,$U[{ofˋ L_*\rC3}o% n?`1ҳi LJ*Ʊc W읿A\2*UɀeʱGb!f6,, `(1FTǠ;G^6LF9O!FӷV=J#P%)$B0Vg Iݭ{;dAJL*8"hD!Rv"@8BQDXQ"h*PGrkT mhb(%z%k8oMd x3; 4V>e:P8E靴Gah6tq %"YA& K4X k[?YP\޽+\+(\y*il҄dcbԀP qTwRjMȓâNpň$A= F*V(\$Ş96yZq9f9 D:?αާSu~99`Aģ8$y뭹ϳbՁdӅnswҎ<竭 κE#~˼r;}mG9Nfn7 Tuξ8}x^-~8_Eʓ9[ʴm SԐ."M̺|&nU*T&LmըES7ˏJ>ןRMё$r,khx4 Zfa: ,zzU?m?X5팷 T ke;?8BӯVm3xUwP].ki-Teg`Юq)EurnXhZQ/!9PP;QzSTS $`ﺰmݔa+*tWB;Va>>@ tTh=4fcٹhL;3W*tAp6b҄>Ek;/,WQ80#.ſd&QvCM=CKqpvFj-ljy,|Xvnf(r›!™>$^;qGa F8/ Ѫ 1¤c2b-yCvGŢeT~yqД%t '|P69R3KHc0Y.b#%$%'gs-N*]ή ^dP,xgg}U36>BWpLxQo qHD;fHtsƳ%-_B oQDMUOv(nq9;u(iu$s/-Z(YH4/5IT<ܑ\w΂8Bb&k3ް2SNJc=9bW,U%A"#Wc)8l;հ2}sKT%e*m#yQ"\+e ^M1BXyl `7yӅuOėWjE4? ~ca&s; {P:!p½"n2d "3*DD="4otEGQE]級K_?.vRoþDIacH|/*Y9gov*Foxc̔ Ob\izx0A4 4"^Vvq.hM85)NAh\*E;4NATȔu4NsC;4qKSKAkƜAlBVz4rf>G^L1b~=L 7s,nŤ>@>$0˛G_C Z7Y& 0_Q(yj-K$DDd''{ pI(RF:Qc!H@jښLH,oxB<7wB9ү+0PspAh) BE01bWˎg ɧ2)$L*?OCɰ_^\8"6B`oey"<1A!9) À!G$\92YSCJ&DA>˅pXsGLxXnC<. 
[Binary data: gzip-compressed contents of `kubelet.log.gz` from the `zuul-output/logs` archive — not human-readable text and omitted here.]
g2_n~o˹dܲcz}!RԦM =`QVh꥙r7է[T-u@*9mfSVgvrէVoT&.ѳn6CZﴎ#j'6I"') ʎ;nUHz{ϫm> `e=3jJ !K4|FI!z Id<`8K5  KQh'O<-~YN|6 hDQښK[9%0,"\2j,+H\f$e=Ep0[i˯U4˪20ʏD9##rh" + B8~,,r Y,_I\?6%N $qL M4Se\zRAd: 1p!jGmƉN oFPI#~ jкpF]f$,w1+p+TԐ|oͩ=ݦ֚ZWB10zD&UZWov.eV-SQOiu䖥Alivf,#zHLB>#/t+¡V?U[wh{8$4X"ޓ7fU{bmjӏs UifًQ]^Φ}N|-=MOoR* WZ?ݖAZU&WiL7~׵K !-$~Gt*qx<;AR<"2Wc$ah*-j`'/KsҬۧBF+~u$>6fUћ:_GVܾ[ESqaJX / &Ńnz ky8k'-6q \wʋD7xWO3~4&ODZlE&kH&;V#ǀMNN YR PEL 2+#aJ9~8lXlmc_a%775ӳ)'ɱ+^KR,`0is3o$p5$H $t60|@:Bߥx$tH9S*(&JC ZHD[rMS |t&=qC`vw_&kqu+hom9a^Yi- wb#^L\ӛ x O޾yqzl$>Japr5xM]Q}U&%|>k! ` =·C|k$A@Æia(*%Zyؗ]PwC%j}-0 D+o[R@v$JOS-\@"$m D4 o'\icu$o@TD+Yc(7'%s/wF+\[k:c7.\ 7G "R|&_-l.( z]cp m`[k6PV_{黸?+_Q ?L3|zX?ߦ37F$*kt95CZ RM%@R,KKs zwCT{WDxcg!RWD{k$P # O ^bWU@ :r&d;$ҝl2>u"}.A29ݔL)"/ 8{!| ;sUwCŜLX "_zxqV˗5O<\Rv;kPs49nRN|$/| ;F4gQa˵۠z޸2t@<+0irUT\C &?:Njt4," Z +-aP>­㓄2JccB_q[Pbrm]t~TN|~Ukl ४?;|w,e M6G ^w3v2iœHG h[ZJm.5h5/S&2PLQ[{[Xnt ve,]6׈͢q$u 9 o>~[{%­Q OTDﲟt}C\CVA(N`e[%yHh;yt3nZVFG3jM.k?tu 17BD~#y_k 71$ DMPrrTE'hA3Pw֠T50͇xi8pj\ٸ[E٥8EN$bvJK'%ZPt 2MYt~El!M$f/!?LDYb{΢F) I#O}F|B(aNAy:!8׊6VRRh*ndiN]F,Uţ@۔#oZG Z5_tJSX,NPR KlO[#ԡ4PV􀮊sTVi%9u9p X0|D^.<{<1F6:ԕJZ,"x iX'A"7=249b}W|=nq`FE,͜HMČK! 
qv"Re -#T/V LT:l[B!쉰BWHX"KX(k>>tef)h 8*"KB3crx66pC쁘ΘyR%9ilQ MD?d =a&G>1ITCT <3`-6\"wԝUM9d?ƈVU y*ZRFT@[O.}mPwVbӮ\Y^ZKe-4(j9Ay E=r h#kPXͫXjYcj4)jP(D.4ZhP(<Ԕh zTJ>Р"Py)ԍ=&La{jP2NN.XʀqU (Jˬ2ڌZ }P-jCYM`!hM`SjJ4޿7T7G!J\ KXDNs d;JH\2Tq#VC,A,S^𵇫 Dvyt]Ƴ OaAkieZS=jثk][njwlȼF,#}dlwac}c{9r# Cպd[&7p7cq~}7z3筸u9kkzQ; P|[vҋFqMqW[KL d TJ)(nhd GBW9jaoB+#KGF!-ڄyCiuNRY9D|.bL acA  ;_)od.Ryda/!9QȀQlNT)v4w~=[\6{dY|yrFǜ ._c -5F841]v[&a)ulh7/z6KmqbeؚΑ\L#p,R|&_-VkZPF=_U8UfV^p3q51匀^HќC-xQi"9<Ã]x E&1&MU re6MU yefQC2ү=SΪ+6+P jMNǖb58jT]CM,ܚxaLī|!q!b=8?ڥrZa8䉳h%WݣL^nR{V!SHԳFǔn#!8䉳 O@>zdmbɇ;r>AJ=j* kPj`0kPYh\\FÜj HaJ?YEIwO"Xy?ϴ <;J|Z 8JG-D]F9|TWKȝ?]|iG6nɬ LNFYu`(Z&uٚ|◻kCoP wq1ox/"m|/Θe2Tq)F z%6W)wQ zAuIoTRO.#&Z4 6+b"%42GrC2 3T$U<N*t\Te|~!{2xeGխt^)y˫g=mn 8'FNG7x O޾yqRb2-< xv&%O $t^ g~<]eoyAN|Xt,)P`y&O| u2 n|( ^U,X[":Hb|^\9w6ٙBXnwsp)&(|̍i~/YY04lKǧ.*cڟ \]\([ ՠC 0QۊrYv_ݍ8b1h;QSywƴ&TV/OazcLW̓UB}NY6d^#)ciOU >QKE>?;Pr&?swR  QKcrQi#RYbDo/y¦{ D>C}䝡fhD_J##''Si.@uy)iF@R j 0C5- )%4g54iӰFqu`iԝUȦ#˦HpcjamJ Y mY"Y!, fT>B0 [@I4e%1ŚFY`1 j6-.SM6y]0# I5u5%ՔՔf@s'ZJAtIE>Q|I3s)o˦p>h7=;v0KJ̳HӔQOU3߶sYm|hv_|aY_S,L|Owiv[O`B 16^U۫Bha6ڟNճ }Դ}f {@z}mᬈnuvH8!Cy|yj1 T$-4M!1E91+5j՗ݷoRvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000004450260115157020300017670 0ustar rootrootMar 19 15:15:39 crc systemd[1]: Starting Kubernetes Kubelet... 
Mar 19 15:15:39 crc restorecon[4668]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 15:15:39 crc restorecon[4668]:
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 15:15:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 15:15:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 15:15:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:39 crc restorecon[4668]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 
crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 
15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 15:15:40 crc restorecon[4668]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 
crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 15:15:40 crc restorecon[4668]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 15:15:40 crc restorecon[4668]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 15:15:40 crc restorecon[4668]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 19 15:15:41 crc kubenswrapper[4771]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 15:15:41 crc kubenswrapper[4771]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 19 15:15:41 crc kubenswrapper[4771]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 15:15:41 crc kubenswrapper[4771]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 15:15:41 crc kubenswrapper[4771]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 19 15:15:41 crc kubenswrapper[4771]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.237648 4771 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245820 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245857 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245868 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245882 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245899 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245913 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245926 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245937 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245959 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245968 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.245978 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246016 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246025 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246033 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246042 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246051 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246059 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246067 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246075 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246083 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246093 4771 
feature_gate.go:330] unrecognized feature gate: Example Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246102 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246111 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246121 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246131 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246141 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246151 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246161 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246171 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246179 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246187 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246196 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246206 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246215 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246223 4771 feature_gate.go:330] unrecognized feature gate: 
ManagedBootImages Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246232 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246243 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246251 4771 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246259 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246268 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246277 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246286 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246295 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246303 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246341 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246349 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246358 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246367 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246375 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 15:15:41 crc 
kubenswrapper[4771]: W0319 15:15:41.246384 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246395 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246405 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246414 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246425 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246433 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246444 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246453 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246461 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246469 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246477 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246485 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246493 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246502 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246510 4771 feature_gate.go:330] 
unrecognized feature gate: MachineConfigNodes Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246518 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246529 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246540 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246552 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246562 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246570 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.246579 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246737 4771 flags.go:64] FLAG: --address="0.0.0.0" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246757 4771 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246772 4771 flags.go:64] FLAG: --anonymous-auth="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246785 4771 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246796 4771 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246806 4771 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246818 4771 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246830 4771 flags.go:64] FLAG: 
--authorization-webhook-cache-authorized-ttl="5m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246840 4771 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246850 4771 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246860 4771 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246871 4771 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246882 4771 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246893 4771 flags.go:64] FLAG: --cgroup-root="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246904 4771 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246914 4771 flags.go:64] FLAG: --client-ca-file="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246927 4771 flags.go:64] FLAG: --cloud-config="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246940 4771 flags.go:64] FLAG: --cloud-provider="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246951 4771 flags.go:64] FLAG: --cluster-dns="[]" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246967 4771 flags.go:64] FLAG: --cluster-domain="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.246978 4771 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247024 4771 flags.go:64] FLAG: --config-dir="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247035 4771 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247047 4771 flags.go:64] FLAG: --container-log-max-files="5" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 
15:15:41.247061 4771 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247072 4771 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247084 4771 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247096 4771 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247107 4771 flags.go:64] FLAG: --contention-profiling="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247118 4771 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247129 4771 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247141 4771 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247153 4771 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247168 4771 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247179 4771 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247191 4771 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247202 4771 flags.go:64] FLAG: --enable-load-reader="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247212 4771 flags.go:64] FLAG: --enable-server="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247223 4771 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247236 4771 flags.go:64] FLAG: --event-burst="100" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247247 4771 flags.go:64] FLAG: --event-qps="50" Mar 19 15:15:41 
crc kubenswrapper[4771]: I0319 15:15:41.247257 4771 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247267 4771 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247278 4771 flags.go:64] FLAG: --eviction-hard="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247289 4771 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247299 4771 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247309 4771 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247320 4771 flags.go:64] FLAG: --eviction-soft="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247331 4771 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247341 4771 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247351 4771 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247361 4771 flags.go:64] FLAG: --experimental-mounter-path="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247370 4771 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247380 4771 flags.go:64] FLAG: --fail-swap-on="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247390 4771 flags.go:64] FLAG: --feature-gates="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247402 4771 flags.go:64] FLAG: --file-check-frequency="20s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247411 4771 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247421 4771 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247431 4771 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247441 4771 flags.go:64] FLAG: --healthz-port="10248" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247450 4771 flags.go:64] FLAG: --help="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247460 4771 flags.go:64] FLAG: --hostname-override="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247469 4771 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247479 4771 flags.go:64] FLAG: --http-check-frequency="20s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247489 4771 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247499 4771 flags.go:64] FLAG: --image-credential-provider-config="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247508 4771 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247518 4771 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247529 4771 flags.go:64] FLAG: --image-service-endpoint="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247539 4771 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247548 4771 flags.go:64] FLAG: --kube-api-burst="100" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247558 4771 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247568 4771 flags.go:64] FLAG: --kube-api-qps="50" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247577 4771 flags.go:64] FLAG: --kube-reserved="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247587 4771 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247596 4771 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247606 4771 flags.go:64] FLAG: --kubelet-cgroups="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247615 4771 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247624 4771 flags.go:64] FLAG: --lock-file="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247634 4771 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247644 4771 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247654 4771 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247668 4771 flags.go:64] FLAG: --log-json-split-stream="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247679 4771 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247689 4771 flags.go:64] FLAG: --log-text-split-stream="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247702 4771 flags.go:64] FLAG: --logging-format="text" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247713 4771 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247726 4771 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247736 4771 flags.go:64] FLAG: --manifest-url="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247745 4771 flags.go:64] FLAG: --manifest-url-header="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247757 4771 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247767 4771 
flags.go:64] FLAG: --max-open-files="1000000" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247778 4771 flags.go:64] FLAG: --max-pods="110" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247788 4771 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247797 4771 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247807 4771 flags.go:64] FLAG: --memory-manager-policy="None" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247816 4771 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247828 4771 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247838 4771 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247847 4771 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247869 4771 flags.go:64] FLAG: --node-status-max-images="50" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247880 4771 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247890 4771 flags.go:64] FLAG: --oom-score-adj="-999" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247899 4771 flags.go:64] FLAG: --pod-cidr="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247910 4771 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247925 4771 flags.go:64] FLAG: --pod-manifest-path="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247935 4771 flags.go:64] FLAG: --pod-max-pids="-1" Mar 19 
15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247944 4771 flags.go:64] FLAG: --pods-per-core="0" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247954 4771 flags.go:64] FLAG: --port="10250" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247964 4771 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.247974 4771 flags.go:64] FLAG: --provider-id="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248011 4771 flags.go:64] FLAG: --qos-reserved="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248022 4771 flags.go:64] FLAG: --read-only-port="10255" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248033 4771 flags.go:64] FLAG: --register-node="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248042 4771 flags.go:64] FLAG: --register-schedulable="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248052 4771 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248068 4771 flags.go:64] FLAG: --registry-burst="10" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248077 4771 flags.go:64] FLAG: --registry-qps="5" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248087 4771 flags.go:64] FLAG: --reserved-cpus="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248097 4771 flags.go:64] FLAG: --reserved-memory="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248110 4771 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248121 4771 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248133 4771 flags.go:64] FLAG: --rotate-certificates="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248144 4771 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248155 
4771 flags.go:64] FLAG: --runonce="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248166 4771 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248178 4771 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248188 4771 flags.go:64] FLAG: --seccomp-default="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248198 4771 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248207 4771 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248217 4771 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248227 4771 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248238 4771 flags.go:64] FLAG: --storage-driver-password="root" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248248 4771 flags.go:64] FLAG: --storage-driver-secure="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248258 4771 flags.go:64] FLAG: --storage-driver-table="stats" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248268 4771 flags.go:64] FLAG: --storage-driver-user="root" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248278 4771 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248288 4771 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248297 4771 flags.go:64] FLAG: --system-cgroups="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248306 4771 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248323 4771 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 19 
15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248333 4771 flags.go:64] FLAG: --tls-cert-file="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248342 4771 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248355 4771 flags.go:64] FLAG: --tls-min-version="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248365 4771 flags.go:64] FLAG: --tls-private-key-file="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248374 4771 flags.go:64] FLAG: --topology-manager-policy="none" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248384 4771 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248394 4771 flags.go:64] FLAG: --topology-manager-scope="container" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248403 4771 flags.go:64] FLAG: --v="2" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248415 4771 flags.go:64] FLAG: --version="false" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248427 4771 flags.go:64] FLAG: --vmodule="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248438 4771 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.248449 4771 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248678 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248690 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248700 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248709 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248719 4771 
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248729 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248738 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248748 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248756 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248765 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248774 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248783 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248793 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248803 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248812 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248822 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248831 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248840 4771 feature_gate.go:330] unrecognized feature gate: Example Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248852 4771 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248862 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248872 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248883 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248894 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248905 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248920 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248928 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248937 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248945 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248954 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248963 4771 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.248973 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249014 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249026 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249035 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249044 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249053 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249063 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249072 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249081 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249091 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249100 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249110 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249120 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249129 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249139 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249148 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249158 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249171 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249184 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249193 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249202 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249211 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249220 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249228 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249243 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249251 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249268 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249276 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249285 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249297 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249306 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249315 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249324 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249332 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249340 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249348 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249357 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249366 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249377 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249388 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.249397 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.249425 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.261131 4771 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.261182 4771 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261368 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261388 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261401 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261414 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261428 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261440 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261451 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261491 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261503 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261515 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261526 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261537 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261549 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261560 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261571 4771 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261583 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261594 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261604 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261616 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261627 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261638 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261650 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261661 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261672 4771 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261683 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261694 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261705 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261716 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261727 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261739 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261751 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261763 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261775 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261786 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261805 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261821 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261834 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261845 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261858 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261876 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261888 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261899 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261911 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261922 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261933 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261944 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261955 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261966 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.261976 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262023 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262036 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262047 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262058 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262070 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262081 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262093 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262104 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262116 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262127 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262137 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262152 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262170 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262212 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262228 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262241 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262252 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262263 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262274 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262285 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262295 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262307 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.262324 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262663 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262682 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262694 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262707 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262718 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262730 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262741 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262755 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262771 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262783 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262794 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262809 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262822 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262835 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262849 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262860 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262872 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262883 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262894 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262905 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262916 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262927 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262937 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262951 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262965 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.262977 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263026 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263038 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263048 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263059 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263070 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263080 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263091 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263102 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263116 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263128 4771 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263139 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263149 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263161 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263171 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263183 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263194 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263205 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263215 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263227 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263238 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263249 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263259 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263270 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263280 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263291 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263301 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263312 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263323 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263333 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263345 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263355 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263366 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263377 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263388 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263398 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263410 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263420 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263431 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263442 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263452 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263464 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263478 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263493 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263507 4771 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.263521 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.263539 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.264933 4771 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.271419 4771 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.275049 4771 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.275157 4771 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.276969 4771 server.go:997] "Starting client certificate rotation"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.277020 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.277223 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.309783 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.311231 4771 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.314681 4771 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.333782 4771 log.go:25] "Validated CRI v1 runtime API"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.374135 4771 log.go:25] "Validated CRI v1 image API"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.376527 4771 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.381916 4771 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-19-15-10-50-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.381968 4771 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.410045 4771 manager.go:217] Machine: {Timestamp:2026-03-19 15:15:41.406079894 +0000 UTC m=+0.634701166 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:c77dd57c-51a3-4dec-a58e-8126c5679a04 BootID:03b7c304-29fa-4242-a7e5-f84ad5b17d5b Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:5e:75:71 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:5e:75:71 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:28:f9:70 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:90:67:db Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:26:b6:eb Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:34:34:a6 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7e:5b:15:0f:45:2b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:16:67:dc:2c:2d:ec Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.410786 4771 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.411144 4771 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.411614 4771 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.411900 4771 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.411973 4771 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.412341 4771 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.412359 4771 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.413111 4771 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.413181 4771 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.413951 4771 state_mem.go:36] "Initialized new in-memory state store" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.414156 4771 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.418313 4771 kubelet.go:418] "Attempting to sync node with API server" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.418350 4771 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.418374 4771 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.418394 4771 kubelet.go:324] "Adding apiserver pod source" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.418412 4771 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 
15:15:41.424701 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.424701 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.424804 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.424852 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.425538 4771 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.426822 4771 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.428798 4771 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431033 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431088 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431104 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431122 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431152 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431169 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431185 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431208 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431225 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431239 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431260 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.431286 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.432691 4771 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.433430 4771 server.go:1280] "Started kubelet" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.433694 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.434341 4771 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.434347 4771 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 15:15:41 crc systemd[1]: Started Kubernetes Kubelet. Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.435782 4771 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.436393 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.436434 4771 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.436749 4771 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.436810 4771 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.437041 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.437134 4771 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.438105 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="200ms" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.438279 4771 server.go:460] "Adding debug handlers to kubelet server" Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.438711 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.438830 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.438915 4771 factory.go:55] Registering systemd factory Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.438939 4771 factory.go:221] Registration of the systemd container factory successfully Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.439734 4771 factory.go:153] Registering CRI-O factory Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.439769 4771 factory.go:221] Registration of the crio container factory successfully Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.439868 4771 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.439901 4771 factory.go:103] Registering Raw factory Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 
15:15:41.439924 4771 manager.go:1196] Started watching for new ooms in manager Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.441958 4771 manager.go:319] Starting recovery of all containers Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.441671 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e46f9af5bf5e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,LastTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.460724 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.460808 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.460838 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.460859 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.460883 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.460908 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.460936 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.460962 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461027 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: 
I0319 15:15:41.461051 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461073 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461098 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461121 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461149 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461173 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461193 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461216 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461276 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461303 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461328 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461352 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461374 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461395 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461416 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461439 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461467 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461497 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461526 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 
19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461548 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461571 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461603 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461628 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461654 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461680 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461702 4771 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461758 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461783 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461808 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461830 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461853 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461880 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461904 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461930 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461957 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.461981 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462038 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462063 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462089 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462112 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462133 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462155 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462178 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462209 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462238 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462264 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462287 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462310 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462334 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462359 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462387 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462409 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462431 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462451 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462473 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462499 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462526 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462548 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462571 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462595 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462620 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462645 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462670 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462705 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462730 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462823 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462852 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462877 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.462902 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465276 4771 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465319 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465343 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465368 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465401 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465421 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465442 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465463 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465483 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465504 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465524 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465545 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465565 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465586 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465606 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465627 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465646 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465668 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465687 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465707 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465727 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465754 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465775 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465795 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465815 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465836 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465862 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465912 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465935 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465957 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.465980 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466035 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466057 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466080 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466102 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466126 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466147 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466169 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466192 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466214 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466234 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466254 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466277 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466298 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466318 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466342 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466362 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466390 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466409 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466431 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466451 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466472 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466492 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466517 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466540 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466561 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466581 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466603 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466624 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466644 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466663 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466685 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466707 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466728 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466749 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466771 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466792 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466813 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466833 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466854 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466875 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466897 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466917 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466940 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466961 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.466982 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467042 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467063 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467083 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467105 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467125 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467146 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467166 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467185 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467205 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467226 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467245 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467265 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467285 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467305 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467324 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467344 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467365 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467386 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467404 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467423 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467445 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467463 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities"
seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467483 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467501 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467520 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467540 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467560 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467580 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 19 15:15:41 crc 
kubenswrapper[4771]: I0319 15:15:41.467600 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467620 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467641 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467660 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467679 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467699 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467721 4771 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467761 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467780 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467800 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467820 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467839 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467859 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467878 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467897 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467917 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467940 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467958 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.467978 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.468032 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.468051 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.468071 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.468091 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.468110 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.468129 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.468148 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.468168 4771 reconstruct.go:97] "Volume reconstruction finished" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.468182 4771 reconciler.go:26] "Reconciler: start to sync state" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.481382 4771 manager.go:324] Recovery completed Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.493517 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.495385 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.495437 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.495448 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.496593 4771 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.496754 4771 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.496879 4771 state_mem.go:36] "Initialized new in-memory state store" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.504135 4771 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.507338 4771 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.507383 4771 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.507418 4771 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.507487 4771 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 15:15:41 crc kubenswrapper[4771]: W0319 15:15:41.508144 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.508231 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.526561 4771 policy_none.go:49] "None policy: Start" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.527835 4771 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.527868 4771 state_mem.go:35] "Initializing new in-memory state store" Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.537819 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.590643 4771 manager.go:334] 
"Starting Device Plugin manager" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.590951 4771 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.590975 4771 server.go:79] "Starting device plugin registration server" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.591642 4771 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.591675 4771 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.591847 4771 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.592107 4771 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.592124 4771 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.607816 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.608044 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.609958 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.610054 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc 
kubenswrapper[4771]: I0319 15:15:41.610074 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.610328 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.610682 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.610771 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.611226 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.611824 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.611883 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.611911 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.612154 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.612184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.612201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.612221 4771 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.612555 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.612616 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.613864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.613914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.613932 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.615468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.615524 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.615541 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.615783 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.615957 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.616038 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.617207 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.617240 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.617275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.617409 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.617558 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.617614 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.617922 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.617950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.617964 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.619100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.619147 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.619157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.619195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.619178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.619312 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.619535 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.619567 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.620337 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.620367 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.620378 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.639269 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="400ms" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670293 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670339 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670377 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670441 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670506 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670534 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670619 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670690 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670766 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670815 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670846 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670903 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670950 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.670973 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.692421 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.693816 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.693883 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.693902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.693941 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.694486 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.772534 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.772603 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.772640 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.772671 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.772745 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc 
kubenswrapper[4771]: I0319 15:15:41.772807 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.772814 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.772864 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.772934 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.772966 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773043 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773020 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773136 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773235 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773188 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773272 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773310 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773288 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773364 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773362 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773419 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773340 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773440 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773411 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773555 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773312 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.773711 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.895479 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.897393 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.897488 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.897578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.897649 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:15:41 crc kubenswrapper[4771]: E0319 15:15:41.898516 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.967463 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 15:15:41 crc kubenswrapper[4771]: I0319 15:15:41.979962 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.009442 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:42 crc kubenswrapper[4771]: W0319 15:15:42.018270 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2b808c585b944e8f3586634b4918632cbeb6f5cbb9d9785c200b9dc134ead7be WatchSource:0}: Error finding container 2b808c585b944e8f3586634b4918632cbeb6f5cbb9d9785c200b9dc134ead7be: Status 404 returned error can't find the container with id 2b808c585b944e8f3586634b4918632cbeb6f5cbb9d9785c200b9dc134ead7be Mar 19 15:15:42 crc kubenswrapper[4771]: W0319 15:15:42.024717 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8341ab014d7b57a4552ece756c6ed17c0846ab9c7713827aa133e2b4a95c44ca WatchSource:0}: Error finding container 8341ab014d7b57a4552ece756c6ed17c0846ab9c7713827aa133e2b4a95c44ca: Status 404 returned error can't find the container with id 8341ab014d7b57a4552ece756c6ed17c0846ab9c7713827aa133e2b4a95c44ca Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.033587 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:42 crc kubenswrapper[4771]: E0319 15:15:42.040653 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="800ms" Mar 19 15:15:42 crc kubenswrapper[4771]: W0319 15:15:42.042063 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-57be0e9b38816ebfea600d76d4e531d560b8d2a17946c882127b8ec73789790e WatchSource:0}: Error finding container 57be0e9b38816ebfea600d76d4e531d560b8d2a17946c882127b8ec73789790e: Status 404 returned error can't find the container with id 57be0e9b38816ebfea600d76d4e531d560b8d2a17946c882127b8ec73789790e Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.043613 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:15:42 crc kubenswrapper[4771]: W0319 15:15:42.066831 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-56f31a236a308f8b52c3e2d0dc460364d27a4ed5a92352e32bf98ade4ef437dc WatchSource:0}: Error finding container 56f31a236a308f8b52c3e2d0dc460364d27a4ed5a92352e32bf98ade4ef437dc: Status 404 returned error can't find the container with id 56f31a236a308f8b52c3e2d0dc460364d27a4ed5a92352e32bf98ade4ef437dc Mar 19 15:15:42 crc kubenswrapper[4771]: W0319 15:15:42.082437 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e408fc81039365deb29f4acba76147f0941433ca97bbbbbbdec2f8eec9e0c98a WatchSource:0}: Error finding container e408fc81039365deb29f4acba76147f0941433ca97bbbbbbdec2f8eec9e0c98a: Status 404 returned error can't find the container with id e408fc81039365deb29f4acba76147f0941433ca97bbbbbbdec2f8eec9e0c98a Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.299660 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.302609 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.302673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.302693 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.302737 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:15:42 crc 
kubenswrapper[4771]: E0319 15:15:42.303382 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 19 15:15:42 crc kubenswrapper[4771]: W0319 15:15:42.372141 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:42 crc kubenswrapper[4771]: E0319 15:15:42.372294 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.434983 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.513378 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"56f31a236a308f8b52c3e2d0dc460364d27a4ed5a92352e32bf98ade4ef437dc"} Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.514399 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"57be0e9b38816ebfea600d76d4e531d560b8d2a17946c882127b8ec73789790e"} Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.516016 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8341ab014d7b57a4552ece756c6ed17c0846ab9c7713827aa133e2b4a95c44ca"} Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.517416 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2b808c585b944e8f3586634b4918632cbeb6f5cbb9d9785c200b9dc134ead7be"} Mar 19 15:15:42 crc kubenswrapper[4771]: I0319 15:15:42.518542 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e408fc81039365deb29f4acba76147f0941433ca97bbbbbbdec2f8eec9e0c98a"} Mar 19 15:15:42 crc kubenswrapper[4771]: W0319 15:15:42.668289 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:42 crc kubenswrapper[4771]: E0319 15:15:42.668467 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:42 crc kubenswrapper[4771]: W0319 15:15:42.752122 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:42 crc kubenswrapper[4771]: E0319 
15:15:42.752327 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:42 crc kubenswrapper[4771]: W0319 15:15:42.812629 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:42 crc kubenswrapper[4771]: E0319 15:15:42.812713 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:42 crc kubenswrapper[4771]: E0319 15:15:42.842173 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="1.6s" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.104102 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.105561 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.105607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.105619 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.105651 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:15:43 crc kubenswrapper[4771]: E0319 15:15:43.106063 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.434885 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.506973 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 15:15:43 crc kubenswrapper[4771]: E0319 15:15:43.508660 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.524741 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434" exitCode=0 Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.524835 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434"} Mar 19 
15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.525143 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.526462 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.526497 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.526509 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.527574 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2" exitCode=0 Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.527620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2"} Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.527800 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.529981 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.530703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.531098 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.531144 4771 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.536210 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.536262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.536282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.536653 4771 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1" exitCode=0 Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.536795 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1"} Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.537197 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.539802 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.539861 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.539882 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.542597 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb" exitCode=0 Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.542694 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb"} Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.542778 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.544859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.544892 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.544905 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.546594 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1"} Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.546656 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e"} Mar 19 15:15:43 crc kubenswrapper[4771]: I0319 15:15:43.546678 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.435458 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:44 crc kubenswrapper[4771]: E0319 15:15:44.443266 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="3.2s" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.557217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.557280 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.557295 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.557308 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.559098 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9" exitCode=0 Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.559214 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.559261 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.562625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.562665 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.562678 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.563961 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f7c0836517f4ced7fafcb12ff073d4ef8885f18aec49a1e28404bb23103c42c8"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.564032 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.566622 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.566669 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.566688 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.569073 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.569117 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.569121 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.569209 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.570004 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.570057 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.570073 4771 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.571658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8"} Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.571747 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.572508 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.572549 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.572561 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.706326 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.707530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.707623 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.707688 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:44 crc kubenswrapper[4771]: I0319 15:15:44.707759 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:15:44 crc kubenswrapper[4771]: E0319 15:15:44.708251 4771 kubelet_node_status.go:99] "Unable to register node 
with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 19 15:15:44 crc kubenswrapper[4771]: W0319 15:15:44.852964 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:44 crc kubenswrapper[4771]: E0319 15:15:44.853096 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:44 crc kubenswrapper[4771]: W0319 15:15:44.946646 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:15:44 crc kubenswrapper[4771]: E0319 15:15:44.946760 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.580967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d7e7e6ba64577ff06e03577513cc084a5674538f4e070540c4773ab503a883f8"} Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 
15:15:45.581926 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.583263 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.583309 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.583327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.585387 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08" exitCode=0 Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.585498 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.585543 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.585606 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.585596 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08"} Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.585604 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.585694 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:45 crc 
kubenswrapper[4771]: I0319 15:15:45.586846 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.586922 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.586945 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.587360 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.587406 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.587425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.587517 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.587547 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.587569 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.587691 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.587727 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:45 crc kubenswrapper[4771]: I0319 15:15:45.587745 4771 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.081301 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.333338 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.594848 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.594979 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc"} Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.595068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1"} Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.595091 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482"} Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.596300 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.596333 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.596350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:46 
crc kubenswrapper[4771]: I0319 15:15:46.674290 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.674505 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.676055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.676114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.676134 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.716197 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.716503 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.718078 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.718125 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.718137 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:46 crc kubenswrapper[4771]: I0319 15:15:46.914368 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.393242 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.404430 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.602148 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ab23395a26fca7d054420d5106512839bed7889bc626449e23186e85233c76a6"} Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.602213 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa"} Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.602263 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.602279 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.602470 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.603822 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.603879 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.603902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 
15:15:47.603946 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.604022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.603838 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.604087 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.604107 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.604050 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.651189 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.908722 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.910350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.910574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.910723 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:47 crc kubenswrapper[4771]: I0319 15:15:47.910867 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:15:48 crc 
kubenswrapper[4771]: I0319 15:15:48.606733 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.606742 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.606784 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.609373 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.609571 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.609616 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.609578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.609684 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.609634 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.609430 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.609897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.609914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 15:15:48 crc kubenswrapper[4771]: I0319 15:15:48.812174 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 19 15:15:49 crc kubenswrapper[4771]: I0319 15:15:49.609887 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:49 crc kubenswrapper[4771]: I0319 15:15:49.611527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:49 crc kubenswrapper[4771]: I0319 15:15:49.611592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:49 crc kubenswrapper[4771]: I0319 15:15:49.611615 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:50 crc kubenswrapper[4771]: I0319 15:15:50.232135 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 19 15:15:50 crc kubenswrapper[4771]: I0319 15:15:50.612243 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:50 crc kubenswrapper[4771]: I0319 15:15:50.613353 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:50 crc kubenswrapper[4771]: I0319 15:15:50.613423 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:50 crc kubenswrapper[4771]: I0319 15:15:50.613447 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:51 crc kubenswrapper[4771]: E0319 15:15:51.612225 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.409636 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.409917 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.412254 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.412334 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.412357 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.416298 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.620129 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.621456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.621510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:53 crc kubenswrapper[4771]: I0319 15:15:53.621527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:54 crc kubenswrapper[4771]: I0319 15:15:54.144944 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:15:54 crc kubenswrapper[4771]: I0319 15:15:54.623128 4771 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:15:54 crc kubenswrapper[4771]: I0319 15:15:54.624516 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:15:54 crc kubenswrapper[4771]: I0319 15:15:54.624595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:15:54 crc kubenswrapper[4771]: I0319 15:15:54.624633 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:15:55 crc kubenswrapper[4771]: W0319 15:15:55.228460 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 19 15:15:55 crc kubenswrapper[4771]: I0319 15:15:55.228625 4771 trace.go:236] Trace[958171951]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 15:15:45.226) (total time: 10002ms): Mar 19 15:15:55 crc kubenswrapper[4771]: Trace[958171951]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:15:55.228) Mar 19 15:15:55 crc kubenswrapper[4771]: Trace[958171951]: [10.00209052s] [10.00209052s] END Mar 19 15:15:55 crc kubenswrapper[4771]: E0319 15:15:55.228678 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 19 15:15:55 crc kubenswrapper[4771]: E0319 15:15:55.300851 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.189e46f9af5bf5e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,LastTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:15:55 crc kubenswrapper[4771]: I0319 15:15:55.435452 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 19 15:15:55 crc kubenswrapper[4771]: W0319 15:15:55.540944 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 19 15:15:55 crc kubenswrapper[4771]: I0319 15:15:55.541093 4771 trace.go:236] Trace[1475808063]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 15:15:45.539) (total time: 10001ms): Mar 19 15:15:55 crc kubenswrapper[4771]: Trace[1475808063]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:15:55.540) Mar 19 15:15:55 crc kubenswrapper[4771]: Trace[1475808063]: [10.001313951s] [10.001313951s] END Mar 19 15:15:55 crc kubenswrapper[4771]: E0319 15:15:55.541120 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 19 15:15:55 crc kubenswrapper[4771]: E0319 15:15:55.809719 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 15:15:55 crc kubenswrapper[4771]: E0319 15:15:55.817235 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:55Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 15:15:55 crc kubenswrapper[4771]: I0319 15:15:55.821213 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 15:15:55 crc kubenswrapper[4771]: I0319 15:15:55.821328 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 15:15:55 crc kubenswrapper[4771]: E0319 15:15:55.821978 4771 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:55Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 19 15:15:55 crc kubenswrapper[4771]: W0319 15:15:55.823608 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:55Z is after 2026-02-23T05:33:13Z
Mar 19 15:15:55 crc kubenswrapper[4771]: E0319 15:15:55.823722 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 15:15:55 crc kubenswrapper[4771]: W0319 15:15:55.830081 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:55Z is after 2026-02-23T05:33:13Z
Mar 19 15:15:55 crc kubenswrapper[4771]: E0319 15:15:55.830205 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 15:15:55 crc kubenswrapper[4771]: I0319 15:15:55.839550 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 19 15:15:55 crc kubenswrapper[4771]: I0319 15:15:55.839634 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 19 15:15:55 crc kubenswrapper[4771]: I0319 15:15:55.858311 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58864->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Mar 19 15:15:55 crc kubenswrapper[4771]: I0319 15:15:55.858367 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58864->192.168.126.11:17697: read: connection reset by peer"
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.334245 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get 
\"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.334339 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.409941 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.410133 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.437918 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:56Z is after 2026-02-23T05:33:13Z
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.631425 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.634023 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d7e7e6ba64577ff06e03577513cc084a5674538f4e070540c4773ab503a883f8" exitCode=255
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.634081 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d7e7e6ba64577ff06e03577513cc084a5674538f4e070540c4773ab503a883f8"}
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.634328 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.635459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.635525 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.635549 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.636284 4771 scope.go:117] "RemoveContainer" containerID="d7e7e6ba64577ff06e03577513cc084a5674538f4e070540c4773ab503a883f8"
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.919508 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]log ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]etcd ok
Mar 19 15:15:56 crc kubenswrapper[4771]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-api-request-count-filter ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-startkubeinformers ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/priority-and-fairness-config-consumer ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/priority-and-fairness-filter ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-apiextensions-informers ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-apiextensions-controllers ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/crd-informer-synced ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-system-namespaces-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-cluster-authentication-info-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-legacy-token-tracking-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-service-ip-repair-controllers ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Mar 19 15:15:56 crc kubenswrapper[4771]: 
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/priority-and-fairness-config-producer ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/bootstrap-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/start-kube-aggregator-informers ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/apiservice-status-local-available-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/apiservice-status-remote-available-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/apiservice-registration-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/apiservice-wait-for-first-sync ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/apiservice-discovery-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/kube-apiserver-autoregistration ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]autoregister-completion ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/apiservice-openapi-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: [+]poststarthook/apiservice-openapiv3-controller ok
Mar 19 15:15:56 crc kubenswrapper[4771]: livez check failed
Mar 19 15:15:56 crc kubenswrapper[4771]: I0319 15:15:56.919570 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.438729 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T15:15:57Z is after 2026-02-23T05:33:13Z
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.638841 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.639558 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.641691 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ac4f82f78b3f2994e50d86c818b7f38236f5b5a0f6ffa94e345e276c7bde8f04" exitCode=255
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.641729 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ac4f82f78b3f2994e50d86c818b7f38236f5b5a0f6ffa94e345e276c7bde8f04"}
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.641772 4771 scope.go:117] "RemoveContainer" containerID="d7e7e6ba64577ff06e03577513cc084a5674538f4e070540c4773ab503a883f8"
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.641943 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.643259 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.643287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:15:57 crc kubenswrapper[4771]: I0319 15:15:57.643296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:15:57 crc 
kubenswrapper[4771]: I0319 15:15:57.643765 4771 scope.go:117] "RemoveContainer" containerID="ac4f82f78b3f2994e50d86c818b7f38236f5b5a0f6ffa94e345e276c7bde8f04"
Mar 19 15:15:57 crc kubenswrapper[4771]: E0319 15:15:57.643924 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 15:15:58 crc kubenswrapper[4771]: I0319 15:15:58.439528 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:58Z is after 2026-02-23T05:33:13Z
Mar 19 15:15:58 crc kubenswrapper[4771]: I0319 15:15:58.646209 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 19 15:15:59 crc kubenswrapper[4771]: W0319 15:15:59.258064 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:59Z is after 2026-02-23T05:33:13Z
Mar 19 15:15:59 crc kubenswrapper[4771]: E0319 15:15:59.258172 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 15:15:59 crc kubenswrapper[4771]: I0319 15:15:59.440926 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:59Z is after 2026-02-23T05:33:13Z
Mar 19 15:15:59 crc kubenswrapper[4771]: W0319 15:15:59.972827 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:59Z is after 2026-02-23T05:33:13Z
Mar 19 15:15:59 crc kubenswrapper[4771]: E0319 15:15:59.972935 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:15:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.439893 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:00Z is after 2026-02-23T05:33:13Z
Mar 19 15:16:00 crc 
kubenswrapper[4771]: I0319 15:16:00.457534 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.457787 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.459665 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.459719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.459743 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.476965 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.653441 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.654784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.654843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:00 crc kubenswrapper[4771]: I0319 15:16:00.654866 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:01 crc kubenswrapper[4771]: I0319 15:16:01.439846 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-19T15:16:01Z is after 2026-02-23T05:33:13Z
Mar 19 15:16:01 crc kubenswrapper[4771]: E0319 15:16:01.612590 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 15:16:01 crc kubenswrapper[4771]: I0319 15:16:01.922909 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:16:01 crc kubenswrapper[4771]: I0319 15:16:01.923159 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:01 crc kubenswrapper[4771]: I0319 15:16:01.924673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:01 crc kubenswrapper[4771]: I0319 15:16:01.924732 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:01 crc kubenswrapper[4771]: I0319 15:16:01.924750 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:01 crc kubenswrapper[4771]: I0319 15:16:01.925581 4771 scope.go:117] "RemoveContainer" containerID="ac4f82f78b3f2994e50d86c818b7f38236f5b5a0f6ffa94e345e276c7bde8f04"
Mar 19 15:16:01 crc kubenswrapper[4771]: E0319 15:16:01.925863 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 15:16:01 crc kubenswrapper[4771]: I0319 15:16:01.929866 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:16:02 crc 
kubenswrapper[4771]: I0319 15:16:02.217904 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:02 crc kubenswrapper[4771]: I0319 15:16:02.219844 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:02 crc kubenswrapper[4771]: I0319 15:16:02.219907 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:02 crc kubenswrapper[4771]: I0319 15:16:02.219924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:02 crc kubenswrapper[4771]: I0319 15:16:02.219968 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 15:16:02 crc kubenswrapper[4771]: E0319 15:16:02.225370 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:02Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 19 15:16:02 crc kubenswrapper[4771]: E0319 15:16:02.229311 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:02Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 19 15:16:02 crc kubenswrapper[4771]: I0319 15:16:02.440288 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:02Z is after 2026-02-23T05:33:13Z
Mar 19 15:16:02 crc 
kubenswrapper[4771]: I0319 15:16:02.659106 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:02 crc kubenswrapper[4771]: I0319 15:16:02.660407 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:02 crc kubenswrapper[4771]: I0319 15:16:02.660479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:02 crc kubenswrapper[4771]: I0319 15:16:02.660504 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:02 crc kubenswrapper[4771]: I0319 15:16:02.661565 4771 scope.go:117] "RemoveContainer" containerID="ac4f82f78b3f2994e50d86c818b7f38236f5b5a0f6ffa94e345e276c7bde8f04"
Mar 19 15:16:02 crc kubenswrapper[4771]: E0319 15:16:02.661918 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 15:16:03 crc kubenswrapper[4771]: I0319 15:16:03.438811 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:03Z is after 2026-02-23T05:33:13Z
Mar 19 15:16:04 crc kubenswrapper[4771]: I0319 15:16:04.120464 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 15:16:04 crc kubenswrapper[4771]: E0319 15:16:04.125955 4771 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 15:16:04 crc kubenswrapper[4771]: I0319 15:16:04.437500 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:04Z is after 2026-02-23T05:33:13Z
Mar 19 15:16:05 crc kubenswrapper[4771]: W0319 15:16:05.242469 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:05Z is after 2026-02-23T05:33:13Z
Mar 19 15:16:05 crc kubenswrapper[4771]: E0319 15:16:05.242572 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 15:16:05 crc kubenswrapper[4771]: E0319 15:16:05.306776 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e46f9af5bf5e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,LastTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:05 crc kubenswrapper[4771]: I0319 15:16:05.439930 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:05Z is after 2026-02-23T05:33:13Z
Mar 19 15:16:05 crc kubenswrapper[4771]: I0319 15:16:05.624451 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:16:05 crc kubenswrapper[4771]: I0319 15:16:05.624888 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:05 crc kubenswrapper[4771]: I0319 15:16:05.626467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:05 crc kubenswrapper[4771]: I0319 15:16:05.626530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:05 crc kubenswrapper[4771]: I0319 15:16:05.626553 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:05 crc kubenswrapper[4771]: I0319 15:16:05.627424 4771 scope.go:117] "RemoveContainer" containerID="ac4f82f78b3f2994e50d86c818b7f38236f5b5a0f6ffa94e345e276c7bde8f04"
Mar 19 15:16:05 crc kubenswrapper[4771]: E0319 15:16:05.627707 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 15:16:05 crc kubenswrapper[4771]: W0319 15:16:05.760925 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:05Z is after 2026-02-23T05:33:13Z
Mar 19 15:16:05 crc kubenswrapper[4771]: E0319 15:16:05.761319 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 15:16:06 crc kubenswrapper[4771]: I0319 15:16:06.333657 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:16:06 crc kubenswrapper[4771]: I0319 15:16:06.334775 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:06 crc kubenswrapper[4771]: I0319 15:16:06.336373 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:06 crc kubenswrapper[4771]: I0319 15:16:06.336605 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:06 crc kubenswrapper[4771]: I0319 15:16:06.336843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:06 crc kubenswrapper[4771]: I0319 15:16:06.338195 4771 scope.go:117] "RemoveContainer" containerID="ac4f82f78b3f2994e50d86c818b7f38236f5b5a0f6ffa94e345e276c7bde8f04"
Mar 19 15:16:06 crc kubenswrapper[4771]: E0319 15:16:06.338731 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 15:16:06 crc kubenswrapper[4771]: I0319 15:16:06.411041 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 15:16:06 crc kubenswrapper[4771]: I0319 15:16:06.411458 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 15:16:06 crc 
kubenswrapper[4771]: I0319 15:16:06.439572 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:06Z is after 2026-02-23T05:33:13Z Mar 19 15:16:07 crc kubenswrapper[4771]: I0319 15:16:07.438913 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:07Z is after 2026-02-23T05:33:13Z Mar 19 15:16:08 crc kubenswrapper[4771]: I0319 15:16:08.439609 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:08Z is after 2026-02-23T05:33:13Z Mar 19 15:16:09 crc kubenswrapper[4771]: I0319 15:16:09.225977 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:09 crc kubenswrapper[4771]: I0319 15:16:09.227889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:09 crc kubenswrapper[4771]: I0319 15:16:09.227977 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:09 crc kubenswrapper[4771]: I0319 15:16:09.228020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:09 crc kubenswrapper[4771]: I0319 15:16:09.228058 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 
15:16:09 crc kubenswrapper[4771]: E0319 15:16:09.233866 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:09Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 15:16:09 crc kubenswrapper[4771]: E0319 15:16:09.237810 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:09Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 15:16:09 crc kubenswrapper[4771]: I0319 15:16:09.440491 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:09Z is after 2026-02-23T05:33:13Z Mar 19 15:16:10 crc kubenswrapper[4771]: W0319 15:16:10.118032 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:10Z is after 2026-02-23T05:33:13Z Mar 19 15:16:10 crc kubenswrapper[4771]: E0319 15:16:10.118138 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T15:16:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 15:16:10 crc kubenswrapper[4771]: I0319 15:16:10.440457 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:10Z is after 2026-02-23T05:33:13Z Mar 19 15:16:10 crc kubenswrapper[4771]: W0319 15:16:10.451109 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:10Z is after 2026-02-23T05:33:13Z Mar 19 15:16:10 crc kubenswrapper[4771]: E0319 15:16:10.451217 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 15:16:11 crc kubenswrapper[4771]: I0319 15:16:11.439587 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:11Z is after 2026-02-23T05:33:13Z Mar 19 15:16:11 crc kubenswrapper[4771]: E0319 15:16:11.613538 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 15:16:12 
crc kubenswrapper[4771]: I0319 15:16:12.439560 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:12Z is after 2026-02-23T05:33:13Z Mar 19 15:16:13 crc kubenswrapper[4771]: I0319 15:16:13.439184 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 2026-02-23T05:33:13Z Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.132172 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:53904->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.132283 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:53904->192.168.126.11:10357: read: connection reset by peer" Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.132355 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.132538 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:14 crc 
kubenswrapper[4771]: I0319 15:16:14.134030 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.134085 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.134106 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.134844 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.135134 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e" gracePeriod=30 Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.439891 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:14Z is after 2026-02-23T05:33:13Z Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.696429 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 15:16:14 
crc kubenswrapper[4771]: I0319 15:16:14.697232 4771 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e" exitCode=255 Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.697273 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e"} Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.697304 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018"} Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.697398 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.698383 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.698414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:14 crc kubenswrapper[4771]: I0319 15:16:14.698425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:15 crc kubenswrapper[4771]: E0319 15:16:15.313166 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e46f9af5bf5e7 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,LastTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:15 crc kubenswrapper[4771]: I0319 15:16:15.439890 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:15Z is after 2026-02-23T05:33:13Z Mar 19 15:16:15 crc kubenswrapper[4771]: I0319 15:16:15.701126 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:15 crc kubenswrapper[4771]: I0319 15:16:15.702410 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:15 crc kubenswrapper[4771]: I0319 15:16:15.702483 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:15 crc kubenswrapper[4771]: I0319 15:16:15.702510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:16 crc kubenswrapper[4771]: I0319 15:16:16.234735 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:16 crc kubenswrapper[4771]: I0319 15:16:16.236516 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:16 crc 
kubenswrapper[4771]: I0319 15:16:16.236560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:16 crc kubenswrapper[4771]: I0319 15:16:16.236576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:16 crc kubenswrapper[4771]: I0319 15:16:16.236609 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:16:16 crc kubenswrapper[4771]: E0319 15:16:16.241616 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:16Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 15:16:16 crc kubenswrapper[4771]: E0319 15:16:16.243256 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:16Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 15:16:16 crc kubenswrapper[4771]: I0319 15:16:16.439529 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:16Z is after 2026-02-23T05:33:13Z Mar 19 15:16:17 crc kubenswrapper[4771]: I0319 15:16:17.439358 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T15:16:17Z is after 2026-02-23T05:33:13Z Mar 19 15:16:18 crc kubenswrapper[4771]: I0319 15:16:18.439701 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:18Z is after 2026-02-23T05:33:13Z Mar 19 15:16:18 crc kubenswrapper[4771]: I0319 15:16:18.508641 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:18 crc kubenswrapper[4771]: I0319 15:16:18.510199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:18 crc kubenswrapper[4771]: I0319 15:16:18.510260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:18 crc kubenswrapper[4771]: I0319 15:16:18.510278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:18 crc kubenswrapper[4771]: I0319 15:16:18.511354 4771 scope.go:117] "RemoveContainer" containerID="ac4f82f78b3f2994e50d86c818b7f38236f5b5a0f6ffa94e345e276c7bde8f04" Mar 19 15:16:19 crc kubenswrapper[4771]: I0319 15:16:19.439459 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:19Z is after 2026-02-23T05:33:13Z Mar 19 15:16:19 crc kubenswrapper[4771]: I0319 15:16:19.716462 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 15:16:19 crc kubenswrapper[4771]: I0319 
15:16:19.718827 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c33f52b2cefe64aed6b27b5c2633a731ed17518bd9e9a377c4dfa8bd666fa26"} Mar 19 15:16:19 crc kubenswrapper[4771]: I0319 15:16:19.718967 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:19 crc kubenswrapper[4771]: I0319 15:16:19.720117 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:19 crc kubenswrapper[4771]: I0319 15:16:19.720201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:19 crc kubenswrapper[4771]: I0319 15:16:19.720222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.360090 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 15:16:20 crc kubenswrapper[4771]: E0319 15:16:20.366489 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 15:16:20 crc kubenswrapper[4771]: E0319 15:16:20.367753 4771 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 
15:16:20.438801 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:20Z is after 2026-02-23T05:33:13Z Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.724337 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.725184 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.727396 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3c33f52b2cefe64aed6b27b5c2633a731ed17518bd9e9a377c4dfa8bd666fa26" exitCode=255 Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.727452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3c33f52b2cefe64aed6b27b5c2633a731ed17518bd9e9a377c4dfa8bd666fa26"} Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.727501 4771 scope.go:117] "RemoveContainer" containerID="ac4f82f78b3f2994e50d86c818b7f38236f5b5a0f6ffa94e345e276c7bde8f04" Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.727663 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.729150 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.729203 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.729223 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:20 crc kubenswrapper[4771]: I0319 15:16:20.731104 4771 scope.go:117] "RemoveContainer" containerID="3c33f52b2cefe64aed6b27b5c2633a731ed17518bd9e9a377c4dfa8bd666fa26" Mar 19 15:16:20 crc kubenswrapper[4771]: E0319 15:16:20.731462 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 15:16:21 crc kubenswrapper[4771]: I0319 15:16:21.439962 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:21Z is after 2026-02-23T05:33:13Z Mar 19 15:16:21 crc kubenswrapper[4771]: E0319 15:16:21.613723 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 15:16:21 crc kubenswrapper[4771]: I0319 15:16:21.733140 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 15:16:22 crc kubenswrapper[4771]: I0319 15:16:22.440084 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:22Z is after 2026-02-23T05:33:13Z Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.242744 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.244600 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.244655 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.244679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.244721 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:16:23 crc kubenswrapper[4771]: E0319 15:16:23.249413 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:23Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 15:16:23 crc kubenswrapper[4771]: E0319 15:16:23.253893 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:23Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.410117 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.410358 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.411806 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.411859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.411884 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:23 crc kubenswrapper[4771]: I0319 15:16:23.440095 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:23Z is after 2026-02-23T05:33:13Z Mar 19 15:16:23 crc kubenswrapper[4771]: W0319 15:16:23.762845 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:23Z is after 2026-02-23T05:33:13Z Mar 19 15:16:23 crc kubenswrapper[4771]: E0319 15:16:23.762947 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T15:16:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 15:16:24 crc kubenswrapper[4771]: I0319 15:16:24.145510 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:16:24 crc kubenswrapper[4771]: I0319 15:16:24.145723 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:24 crc kubenswrapper[4771]: I0319 15:16:24.147212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:24 crc kubenswrapper[4771]: I0319 15:16:24.147278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:24 crc kubenswrapper[4771]: I0319 15:16:24.147296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:24 crc kubenswrapper[4771]: I0319 15:16:24.438018 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:24Z is after 2026-02-23T05:33:13Z Mar 19 15:16:24 crc kubenswrapper[4771]: W0319 15:16:24.486048 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:24Z is after 2026-02-23T05:33:13Z Mar 19 15:16:24 crc kubenswrapper[4771]: E0319 15:16:24.486110 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 15:16:25 crc kubenswrapper[4771]: E0319 15:16:25.319414 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:25Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e46f9af5bf5e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,LastTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:25 crc kubenswrapper[4771]: I0319 15:16:25.441486 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:25 crc kubenswrapper[4771]: I0319 15:16:25.625202 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:16:25 crc kubenswrapper[4771]: I0319 15:16:25.625664 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:25 crc kubenswrapper[4771]: I0319 15:16:25.627294 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:25 crc kubenswrapper[4771]: I0319 15:16:25.627359 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:25 crc kubenswrapper[4771]: I0319 15:16:25.627377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:25 crc kubenswrapper[4771]: I0319 15:16:25.628440 4771 scope.go:117] "RemoveContainer" containerID="3c33f52b2cefe64aed6b27b5c2633a731ed17518bd9e9a377c4dfa8bd666fa26"
Mar 19 15:16:25 crc kubenswrapper[4771]: E0319 15:16:25.628725 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 15:16:25 crc kubenswrapper[4771]: W0319 15:16:25.843856 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 19 15:16:25 crc kubenswrapper[4771]: E0319 15:16:25.843929 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 19 15:16:26 crc kubenswrapper[4771]: I0319 15:16:26.333411 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:16:26 crc kubenswrapper[4771]: I0319 15:16:26.333659 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:26 crc kubenswrapper[4771]: I0319 15:16:26.335204 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:26 crc kubenswrapper[4771]: I0319 15:16:26.335297 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:26 crc kubenswrapper[4771]: I0319 15:16:26.335349 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:26 crc kubenswrapper[4771]: I0319 15:16:26.336117 4771 scope.go:117] "RemoveContainer" containerID="3c33f52b2cefe64aed6b27b5c2633a731ed17518bd9e9a377c4dfa8bd666fa26"
Mar 19 15:16:26 crc kubenswrapper[4771]: E0319 15:16:26.336391 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 15:16:26 crc kubenswrapper[4771]: I0319 15:16:26.410425 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 15:16:26 crc kubenswrapper[4771]: I0319 15:16:26.410582 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 15:16:26 crc kubenswrapper[4771]: I0319 15:16:26.445770 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:27 crc kubenswrapper[4771]: I0319 15:16:27.441767 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:28 crc kubenswrapper[4771]: I0319 15:16:28.442087 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:29 crc kubenswrapper[4771]: W0319 15:16:29.370667 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:29 crc kubenswrapper[4771]: E0319 15:16:29.370732 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 19 15:16:29 crc kubenswrapper[4771]: I0319 15:16:29.441209 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:30 crc kubenswrapper[4771]: I0319 15:16:30.255458 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:30 crc kubenswrapper[4771]: E0319 15:16:30.256616 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 19 15:16:30 crc kubenswrapper[4771]: I0319 15:16:30.257687 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:30 crc kubenswrapper[4771]: I0319 15:16:30.257720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:30 crc kubenswrapper[4771]: I0319 15:16:30.257728 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:30 crc kubenswrapper[4771]: I0319 15:16:30.257748 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 15:16:30 crc kubenswrapper[4771]: E0319 15:16:30.263960 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 19 15:16:30 crc kubenswrapper[4771]: I0319 15:16:30.441353 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:31 crc kubenswrapper[4771]: I0319 15:16:31.442463 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:31 crc kubenswrapper[4771]: E0319 15:16:31.614481 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 15:16:32 crc kubenswrapper[4771]: I0319 15:16:32.441570 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:33 crc kubenswrapper[4771]: I0319 15:16:33.441704 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:34 crc kubenswrapper[4771]: I0319 15:16:34.440317 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.064495 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.064650 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.066262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.066492 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.066643 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.073754 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.327566 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9af5bf5e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,LastTimestamp:2026-03-19 15:15:41.433394663 +0000 UTC m=+0.662015905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.335303 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30e757a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495424378 +0000 UTC m=+0.724045570,LastTimestamp:2026-03-19 15:15:41.495424378 +0000 UTC m=+0.724045570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.341880 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ec43b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495444539 +0000 UTC m=+0.724065741,LastTimestamp:2026-03-19 15:15:41.495444539 +0000 UTC m=+0.724065741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.349249 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ee3a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495452579 +0000 UTC m=+0.724073781,LastTimestamp:2026-03-19 15:15:41.495452579 +0000 UTC m=+0.724073781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.355648 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b909b96f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.595777391 +0000 UTC m=+0.824398633,LastTimestamp:2026-03-19 15:15:41.595777391 +0000 UTC m=+0.824398633,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.362952 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30e757a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30e757a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495424378 +0000 UTC m=+0.724045570,LastTimestamp:2026-03-19 15:15:41.610029245 +0000 UTC m=+0.838650487,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.370072 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ec43b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ec43b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495444539 +0000 UTC m=+0.724065741,LastTimestamp:2026-03-19 15:15:41.610067266 +0000 UTC m=+0.838688508,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.377513 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ee3a3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ee3a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495452579 +0000 UTC m=+0.724073781,LastTimestamp:2026-03-19 15:15:41.610085376 +0000 UTC m=+0.838706618,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.384315 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30e757a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30e757a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495424378 +0000 UTC m=+0.724045570,LastTimestamp:2026-03-19 15:15:41.611866168 +0000 UTC m=+0.840487410,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.390929 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ec43b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ec43b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495444539 +0000 UTC m=+0.724065741,LastTimestamp:2026-03-19 15:15:41.611899129 +0000 UTC m=+0.840520371,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.397439 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ee3a3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ee3a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495452579 +0000 UTC m=+0.724073781,LastTimestamp:2026-03-19 15:15:41.611923939 +0000 UTC m=+0.840545171,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.404382 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30e757a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30e757a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495424378 +0000 UTC m=+0.724045570,LastTimestamp:2026-03-19 15:15:41.612174375 +0000 UTC m=+0.840795607,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.411682 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ec43b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ec43b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495444539 +0000 UTC m=+0.724065741,LastTimestamp:2026-03-19 15:15:41.612194345 +0000 UTC m=+0.840815587,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.418701 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ee3a3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ee3a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495452579 +0000 UTC m=+0.724073781,LastTimestamp:2026-03-19 15:15:41.612211336 +0000 UTC m=+0.840832568,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.423209 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30e757a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30e757a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495424378 +0000 UTC m=+0.724045570,LastTimestamp:2026-03-19 15:15:41.613893876 +0000 UTC m=+0.842515108,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.425804 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ec43b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ec43b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495444539 +0000 UTC m=+0.724065741,LastTimestamp:2026-03-19 15:15:41.613925217 +0000 UTC m=+0.842546449,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.432530 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ee3a3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ee3a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495452579 +0000 UTC m=+0.724073781,LastTimestamp:2026-03-19 15:15:41.613941647 +0000 UTC m=+0.842562889,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.439600 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.440199 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30e757a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30e757a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495424378 +0000 UTC m=+0.724045570,LastTimestamp:2026-03-19 15:15:41.615505013 +0000 UTC m=+0.844126225,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.446505 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ec43b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ec43b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495444539 +0000 UTC m=+0.724065741,LastTimestamp:2026-03-19 15:15:41.615534724 +0000 UTC m=+0.844155936,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.457176 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ee3a3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ee3a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495452579 +0000 UTC m=+0.724073781,LastTimestamp:2026-03-19 15:15:41.615548944 +0000 UTC m=+0.844170156,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.463896 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30e757a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30e757a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495424378 +0000 UTC m=+0.724045570,LastTimestamp:2026-03-19 15:15:41.617225774 +0000 UTC m=+0.845846986,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.470941 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ec43b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ec43b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495444539 +0000 UTC m=+0.724065741,LastTimestamp:2026-03-19 15:15:41.617249734 +0000 UTC m=+0.845870946,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.477638 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ee3a3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ee3a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495452579 +0000 UTC m=+0.724073781,LastTimestamp:2026-03-19 15:15:41.617283985 +0000 UTC m=+0.845905197,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.499768 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30e757a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30e757a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495424378 +0000 UTC m=+0.724045570,LastTimestamp:2026-03-19 15:15:41.617942051 +0000 UTC m=+0.846563263,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.518115 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e46f9b30ec43b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e46f9b30ec43b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:41.495444539 +0000 UTC m=+0.724065741,LastTimestamp:2026-03-19 15:15:41.617959391 +0000 UTC m=+0.846580603,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.529841 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e46f9d2f62d68 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.030703976 +0000 UTC m=+1.259325218,LastTimestamp:2026-03-19 15:15:42.030703976 +0000 UTC m=+1.259325218,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.536358 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46f9d2f9dae0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.030944992 +0000 UTC m=+1.259566224,LastTimestamp:2026-03-19 15:15:42.030944992 +0000 UTC m=+1.259566224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.537949 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46f9d3e23fca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.046175178 +0000 UTC m=+1.274796420,LastTimestamp:2026-03-19 15:15:42.046175178 +0000 UTC m=+1.274796420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.544545 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46f9d5d95812 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.079146002 +0000 UTC m=+1.307767244,LastTimestamp:2026-03-19 15:15:42.079146002 +0000 UTC m=+1.307767244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.553313 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46f9d697d94d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.091630925 +0000 UTC m=+1.320252167,LastTimestamp:2026-03-19 15:15:42.091630925 +0000 UTC m=+1.320252167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.555045 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46f9fcf7a223 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.735442467 +0000 UTC m=+1.964063699,LastTimestamp:2026-03-19 15:15:42.735442467 +0000 UTC m=+1.964063699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.559430 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46f9fcf8cffa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.735519738 +0000 UTC m=+1.964140980,LastTimestamp:2026-03-19 15:15:42.735519738 +0000 UTC m=+1.964140980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.565738 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e46f9fcf9a15c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.73557334 +0000 UTC m=+1.964194592,LastTimestamp:2026-03-19 15:15:42.73557334 +0000 UTC m=+1.964194592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.571799 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46f9fd5ab42c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.741935148 +0000 UTC m=+1.970556380,LastTimestamp:2026-03-19 15:15:42.741935148 +0000 UTC m=+1.970556380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.577367 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46f9fd688c48 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.74284244 +0000 UTC m=+1.971463682,LastTimestamp:2026-03-19 15:15:42.74284244 +0000 UTC m=+1.971463682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.581137 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46f9fe05c18f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.753145231 +0000 UTC m=+1.981766473,LastTimestamp:2026-03-19 15:15:42.753145231 +0000 UTC m=+1.981766473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.584920 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46f9fe132ef2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.754025202 +0000 UTC m=+1.982646434,LastTimestamp:2026-03-19 15:15:42.754025202 +0000 UTC m=+1.982646434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.591763 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46f9fe3ac5d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.756619733 +0000 UTC m=+1.985240975,LastTimestamp:2026-03-19 15:15:42.756619733 +0000 UTC m=+1.985240975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.600625 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e46f9fe7aa9eb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.760806891 +0000 UTC m=+1.989428123,LastTimestamp:2026-03-19 15:15:42.760806891 +0000 UTC m=+1.989428123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.605023 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46f9feb18e83 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.764404355 +0000 UTC m=+1.993025577,LastTimestamp:2026-03-19 15:15:42.764404355 +0000 UTC m=+1.993025577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.611125 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46f9fec6400f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.765760527 +0000 UTC m=+1.994381739,LastTimestamp:2026-03-19 15:15:42.765760527 +0000 UTC m=+1.994381739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.621259 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa16079fdf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.155920863 +0000 UTC m=+2.384542095,LastTimestamp:2026-03-19 15:15:43.155920863 
+0000 UTC m=+2.384542095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.627395 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa16c64c1c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.168416796 +0000 UTC m=+2.397038028,LastTimestamp:2026-03-19 15:15:43.168416796 +0000 UTC m=+2.397038028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.634872 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa16e07e5c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.170133596 +0000 UTC m=+2.398754838,LastTimestamp:2026-03-19 15:15:43.170133596 +0000 UTC m=+2.398754838,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.636323 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa24c6fbfb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.403342843 +0000 UTC m=+2.631964085,LastTimestamp:2026-03-19 15:15:43.403342843 +0000 UTC m=+2.631964085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.639836 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa2554a322 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.41262621 +0000 UTC m=+2.641247412,LastTimestamp:2026-03-19 15:15:43.41262621 +0000 UTC m=+2.641247412,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.643213 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa256d54ce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.414244558 +0000 UTC m=+2.642865800,LastTimestamp:2026-03-19 15:15:43.414244558 +0000 UTC m=+2.642865800,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 
15:16:35.649046 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa2c4c6bd8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.52952828 +0000 UTC m=+2.758149522,LastTimestamp:2026-03-19 15:15:43.52952828 +0000 UTC m=+2.758149522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.653038 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fa2c954d2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.534304555 +0000 UTC 
m=+2.762925797,LastTimestamp:2026-03-19 15:15:43.534304555 +0000 UTC m=+2.762925797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.659145 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e46fa2d0b8a93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.542053523 +0000 UTC m=+2.770674735,LastTimestamp:2026-03-19 15:15:43.542053523 +0000 UTC m=+2.770674735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.663827 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46fa2d5f2bce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.547534286 +0000 UTC m=+2.776155528,LastTimestamp:2026-03-19 15:15:43.547534286 +0000 UTC m=+2.776155528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.670181 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa34db396a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.673104746 +0000 UTC m=+2.901725948,LastTimestamp:2026-03-19 15:15:43.673104746 +0000 UTC m=+2.901725948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.676460 4771 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa3595c51a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.685330202 +0000 UTC m=+2.913951404,LastTimestamp:2026-03-19 15:15:43.685330202 +0000 UTC m=+2.913951404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.680392 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fa3aead0ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.774789838 +0000 UTC m=+3.003411050,LastTimestamp:2026-03-19 15:15:43.774789838 +0000 UTC m=+3.003411050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: 
E0319 15:16:35.686742 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e46fa3b11d45a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.77734665 +0000 UTC m=+3.005967852,LastTimestamp:2026-03-19 15:15:43.77734665 +0000 UTC m=+3.005967852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.690482 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46fa3b131b14 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.777430292 +0000 UTC m=+3.006051504,LastTimestamp:2026-03-19 15:15:43.777430292 +0000 UTC m=+3.006051504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.694582 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa3bbafacf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.788432079 +0000 UTC m=+3.017053281,LastTimestamp:2026-03-19 15:15:43.788432079 +0000 UTC m=+3.017053281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.699423 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e46fa3c2dd385 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.795958661 +0000 UTC m=+3.024579873,LastTimestamp:2026-03-19 15:15:43.795958661 
+0000 UTC m=+3.024579873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.704289 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46fa3c2dd3d5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.795958741 +0000 UTC m=+3.024579973,LastTimestamp:2026-03-19 15:15:43.795958741 +0000 UTC m=+3.024579973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.712673 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fa3c391063 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.796695139 +0000 UTC m=+3.025316351,LastTimestamp:2026-03-19 
15:15:43.796695139 +0000 UTC m=+3.025316351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.720918 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46fa3c44ffb1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.797477297 +0000 UTC m=+3.026098509,LastTimestamp:2026-03-19 15:15:43.797477297 +0000 UTC m=+3.026098509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.728331 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa3ca76310 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.803925264 +0000 UTC m=+3.032546476,LastTimestamp:2026-03-19 15:15:43.803925264 +0000 UTC m=+3.032546476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.733796 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa3ce3d3c8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.80788628 +0000 UTC m=+3.036507522,LastTimestamp:2026-03-19 15:15:43.80788628 +0000 UTC m=+3.036507522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.738754 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46fa48f97e68 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.010632808 +0000 UTC m=+3.239254010,LastTimestamp:2026-03-19 15:15:44.010632808 +0000 UTC m=+3.239254010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.744266 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa48fa5857 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.010688599 +0000 UTC m=+3.239309801,LastTimestamp:2026-03-19 15:15:44.010688599 +0000 UTC m=+3.239309801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.750177 4771 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa49ea66ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.026420991 +0000 UTC m=+3.255042213,LastTimestamp:2026-03-19 15:15:44.026420991 +0000 UTC m=+3.255042213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.754259 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa49feabfd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.027749373 +0000 UTC m=+3.256370595,LastTimestamp:2026-03-19 15:15:44.027749373 +0000 UTC m=+3.256370595,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.759390 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46fa4a20504e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.029954126 +0000 UTC m=+3.258575328,LastTimestamp:2026-03-19 15:15:44.029954126 +0000 UTC m=+3.258575328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.763108 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46fa4a2cc96e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.030771566 +0000 UTC m=+3.259392768,LastTimestamp:2026-03-19 15:15:44.030771566 +0000 UTC m=+3.259392768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.769214 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa558804d9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.221299929 +0000 UTC m=+3.449921131,LastTimestamp:2026-03-19 15:15:44.221299929 +0000 UTC m=+3.449921131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.774456 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa562bfd88 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.23204596 +0000 UTC m=+3.460667162,LastTimestamp:2026-03-19 15:15:44.23204596 +0000 UTC m=+3.460667162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.778133 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa5646bbde openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.233798622 +0000 UTC m=+3.462419854,LastTimestamp:2026-03-19 15:15:44.233798622 +0000 UTC m=+3.462419854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.781949 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46fa58a63993 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.273611155 +0000 UTC m=+3.502232367,LastTimestamp:2026-03-19 15:15:44.273611155 +0000 UTC m=+3.502232367,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.786782 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.786749 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e46fa5971d9df openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.286955999 +0000 UTC m=+3.515577201,LastTimestamp:2026-03-19 15:15:44.286955999 +0000 UTC 
m=+3.515577201,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.787698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.787736 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:35 crc kubenswrapper[4771]: I0319 15:16:35.787749 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.792074 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa5ff94add openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.396495581 +0000 UTC m=+3.625116783,LastTimestamp:2026-03-19 15:15:44.396495581 +0000 UTC m=+3.625116783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.795900 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa609f7ae7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.407386855 +0000 UTC m=+3.636008067,LastTimestamp:2026-03-19 15:15:44.407386855 +0000 UTC m=+3.636008067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.799432 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa60b049dc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.408488412 +0000 UTC m=+3.637109624,LastTimestamp:2026-03-19 15:15:44.408488412 +0000 UTC m=+3.637109624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 
15:16:35.803123 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fa69f8474b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.564201291 +0000 UTC m=+3.792822493,LastTimestamp:2026-03-19 15:15:44.564201291 +0000 UTC m=+3.792822493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.808688 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa6c87ed13 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.607169811 +0000 UTC m=+3.835791013,LastTimestamp:2026-03-19 15:15:44.607169811 +0000 UTC 
m=+3.835791013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.812908 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fa6e08aa0c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.632384012 +0000 UTC m=+3.861005244,LastTimestamp:2026-03-19 15:15:44.632384012 +0000 UTC m=+3.861005244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.818626 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fa75e5529c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.764285596 +0000 UTC m=+3.992906828,LastTimestamp:2026-03-19 
15:15:44.764285596 +0000 UTC m=+3.992906828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.824275 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fa76c7145a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:44.779080794 +0000 UTC m=+4.007702006,LastTimestamp:2026-03-19 15:15:44.779080794 +0000 UTC m=+4.007702006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.829146 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46faa7081471 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 
15:15:45.588647025 +0000 UTC m=+4.817268257,LastTimestamp:2026-03-19 15:15:45.588647025 +0000 UTC m=+4.817268257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.834553 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fab4f388cb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:45.822181579 +0000 UTC m=+5.050802851,LastTimestamp:2026-03-19 15:15:45.822181579 +0000 UTC m=+5.050802851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.838053 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fab5b28ce2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:45.834700002 +0000 UTC m=+5.063321244,LastTimestamp:2026-03-19 15:15:45.834700002 +0000 UTC 
m=+5.063321244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.843998 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fab5ca9e3e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:45.83627731 +0000 UTC m=+5.064898542,LastTimestamp:2026-03-19 15:15:45.83627731 +0000 UTC m=+5.064898542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.847407 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fac5c39dc1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.104253889 +0000 UTC m=+5.332875121,LastTimestamp:2026-03-19 
15:15:46.104253889 +0000 UTC m=+5.332875121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.848114 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fac6af9e6a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.119720554 +0000 UTC m=+5.348341786,LastTimestamp:2026-03-19 15:15:46.119720554 +0000 UTC m=+5.348341786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.851173 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fac6c3acb8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.121034936 +0000 UTC 
m=+5.349656178,LastTimestamp:2026-03-19 15:15:46.121034936 +0000 UTC m=+5.349656178,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.854393 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fad6e88061 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.391883873 +0000 UTC m=+5.620505105,LastTimestamp:2026-03-19 15:15:46.391883873 +0000 UTC m=+5.620505105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.857858 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fad7c539c5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.406349253 +0000 UTC m=+5.634970485,LastTimestamp:2026-03-19 15:15:46.406349253 +0000 UTC 
m=+5.634970485,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.864106 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fad7dc9a37 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.407881271 +0000 UTC m=+5.636502513,LastTimestamp:2026-03-19 15:15:46.407881271 +0000 UTC m=+5.636502513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.868219 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fae6ec2b52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.660559698 +0000 UTC 
m=+5.889180930,LastTimestamp:2026-03-19 15:15:46.660559698 +0000 UTC m=+5.889180930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.875481 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fae801eaeb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.678762219 +0000 UTC m=+5.907383421,LastTimestamp:2026-03-19 15:15:46.678762219 +0000 UTC m=+5.907383421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.881315 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46fae81d16de openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.680542942 +0000 UTC m=+5.909164184,LastTimestamp:2026-03-19 15:15:46.680542942 +0000 UTC m=+5.909164184,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.887711 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46faf7a81030 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:46.941308976 +0000 UTC m=+6.169930208,LastTimestamp:2026-03-19 15:15:46.941308976 +0000 UTC m=+6.169930208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.894559 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e46faf893c0e2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 
15:15:46.95675517 +0000 UTC m=+6.185376402,LastTimestamp:2026-03-19 15:15:46.95675517 +0000 UTC m=+6.185376402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.904049 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 15:16:35 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-apiserver-crc.189e46fd08f1f27a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 15:16:35 crc kubenswrapper[4771]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 15:16:35 crc kubenswrapper[4771]: Mar 19 15:16:35 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:55.821298298 +0000 UTC m=+15.049919550,LastTimestamp:2026-03-19 15:15:55.821298298 +0000 UTC m=+15.049919550,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 15:16:35 crc kubenswrapper[4771]: > Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.911197 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fd08f33d6c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:55.82138302 +0000 UTC m=+15.050004262,LastTimestamp:2026-03-19 15:15:55.82138302 +0000 UTC m=+15.050004262,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.915387 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e46fd08f1f27a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 15:16:35 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-apiserver-crc.189e46fd08f1f27a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 15:16:35 crc kubenswrapper[4771]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 15:16:35 crc kubenswrapper[4771]: Mar 19 15:16:35 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:55.821298298 +0000 UTC 
m=+15.049919550,LastTimestamp:2026-03-19 15:15:55.839612111 +0000 UTC m=+15.068233353,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 15:16:35 crc kubenswrapper[4771]: > Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.921925 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e46fd08f33d6c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fd08f33d6c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:55.82138302 +0000 UTC m=+15.050004262,LastTimestamp:2026-03-19 15:15:55.839667753 +0000 UTC m=+15.068288995,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.926082 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 15:16:35 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-apiserver-crc.189e46fd0b275b63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:58864->192.168.126.11:17697: read: connection reset by peer Mar 19 15:16:35 crc kubenswrapper[4771]: body: Mar 19 15:16:35 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:55.858352995 +0000 UTC m=+15.086974187,LastTimestamp:2026-03-19 15:15:55.858352995 +0000 UTC m=+15.086974187,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 15:16:35 crc kubenswrapper[4771]: > Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.932680 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e46fd0b27f588 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58864->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:55.858392456 +0000 UTC m=+15.087013658,LastTimestamp:2026-03-19 15:15:55.858392456 +0000 UTC m=+15.087013658,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.937637 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 15:16:35 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-apiserver-crc.189e46fd2785d897 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 19 15:16:35 crc kubenswrapper[4771]: body: Mar 19 15:16:35 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:56.334307479 +0000 UTC m=+15.562928721,LastTimestamp:2026-03-19 15:15:56.334307479 +0000 UTC m=+15.562928721,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 15:16:35 crc kubenswrapper[4771]: > Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.943761 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 15:16:35 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-controller-manager-crc.189e46fd2c0a2507 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 15:16:35 crc kubenswrapper[4771]: body: Mar 19 15:16:35 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:56.410086663 +0000 UTC m=+15.638707905,LastTimestamp:2026-03-19 15:15:56.410086663 +0000 UTC m=+15.638707905,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 15:16:35 crc kubenswrapper[4771]: > Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.950927 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fd2c0b8395 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:56.410176405 +0000 UTC m=+15.638797647,LastTimestamp:2026-03-19 15:15:56.410176405 +0000 UTC 
m=+15.638797647,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.960236 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e46fd2c0a2507\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 15:16:35 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-controller-manager-crc.189e46fd2c0a2507 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 15:16:35 crc kubenswrapper[4771]: body: Mar 19 15:16:35 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:56.410086663 +0000 UTC m=+15.638707905,LastTimestamp:2026-03-19 15:16:06.411419306 +0000 UTC m=+25.640040548,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 15:16:35 crc kubenswrapper[4771]: > Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.966520 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e46fd2c0b8395\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fd2c0b8395 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:56.410176405 +0000 UTC m=+15.638797647,LastTimestamp:2026-03-19 15:16:06.411621352 +0000 UTC m=+25.640242604,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.970670 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 15:16:35 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-controller-manager-crc.189e47014c5d147e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:53904->192.168.126.11:10357: read: connection reset by peer Mar 19 15:16:35 crc kubenswrapper[4771]: body: Mar 19 15:16:35 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:16:14.132262014 +0000 UTC m=+33.360883256,LastTimestamp:2026-03-19 
15:16:14.132262014 +0000 UTC m=+33.360883256,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 15:16:35 crc kubenswrapper[4771]: > Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.977336 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e47014c5df2a5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:53904->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:16:14.132318885 +0000 UTC m=+33.360940117,LastTimestamp:2026-03-19 15:16:14.132318885 +0000 UTC m=+33.360940117,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.983716 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e47014c8870f5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:16:14.135103733 +0000 UTC m=+33.363724975,LastTimestamp:2026-03-19 15:16:14.135103733 +0000 UTC m=+33.363724975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.987977 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e46f9fe3ac5d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46f9fe3ac5d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:42.756619733 +0000 UTC m=+1.985240975,LastTimestamp:2026-03-19 15:16:14.158749385 +0000 UTC m=+33.387370627,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:35 crc kubenswrapper[4771]: E0319 15:16:35.992821 4771 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e46fa16079fdf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa16079fdf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.155920863 +0000 UTC m=+2.384542095,LastTimestamp:2026-03-19 15:16:14.416594968 +0000 UTC m=+33.645216200,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:36 crc kubenswrapper[4771]: E0319 15:16:35.999793 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e46fa16c64c1c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fa16c64c1c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:43.168416796 +0000 UTC 
m=+2.397038028,LastTimestamp:2026-03-19 15:16:14.428503596 +0000 UTC m=+33.657124828,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:36 crc kubenswrapper[4771]: E0319 15:16:36.008939 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e46fd2c0a2507\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 15:16:36 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-controller-manager-crc.189e46fd2c0a2507 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 15:16:36 crc kubenswrapper[4771]: body: Mar 19 15:16:36 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:56.410086663 +0000 UTC m=+15.638707905,LastTimestamp:2026-03-19 15:16:26.410538494 +0000 UTC m=+45.639159736,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 15:16:36 crc kubenswrapper[4771]: > Mar 19 15:16:36 crc kubenswrapper[4771]: E0319 15:16:36.015035 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e46fd2c0b8395\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e46fd2c0b8395 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:15:56.410176405 +0000 UTC m=+15.638797647,LastTimestamp:2026-03-19 15:16:26.410632386 +0000 UTC m=+45.639253658,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:16:36 crc kubenswrapper[4771]: I0319 15:16:36.441292 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:36 crc kubenswrapper[4771]: I0319 15:16:36.724227 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 15:16:36 crc kubenswrapper[4771]: I0319 15:16:36.724440 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:36 crc kubenswrapper[4771]: I0319 15:16:36.725838 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:36 crc kubenswrapper[4771]: I0319 15:16:36.725897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:36 crc kubenswrapper[4771]: I0319 
15:16:36.725920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:37 crc kubenswrapper[4771]: I0319 15:16:37.264651 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:37 crc kubenswrapper[4771]: E0319 15:16:37.265245 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 15:16:37 crc kubenswrapper[4771]: I0319 15:16:37.266441 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:37 crc kubenswrapper[4771]: I0319 15:16:37.266512 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:37 crc kubenswrapper[4771]: I0319 15:16:37.266537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:37 crc kubenswrapper[4771]: I0319 15:16:37.266582 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:16:37 crc kubenswrapper[4771]: E0319 15:16:37.274544 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 15:16:37 crc kubenswrapper[4771]: I0319 15:16:37.441651 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:38 crc kubenswrapper[4771]: I0319 15:16:38.439705 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:39 crc kubenswrapper[4771]: I0319 15:16:39.444820 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.438912 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.508247 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.509409 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.509462 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.509475 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.510093 4771 scope.go:117] "RemoveContainer" containerID="3c33f52b2cefe64aed6b27b5c2633a731ed17518bd9e9a377c4dfa8bd666fa26" Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.800791 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.802556 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6"} Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.802701 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.803560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.803610 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:40 crc kubenswrapper[4771]: I0319 15:16:40.803627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.447498 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:41 crc kubenswrapper[4771]: E0319 15:16:41.614804 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.807035 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.807686 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.809912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6"} Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.810004 4771 scope.go:117] "RemoveContainer" containerID="3c33f52b2cefe64aed6b27b5c2633a731ed17518bd9e9a377c4dfa8bd666fa26" Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.809913 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6" exitCode=255 Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.810179 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.811276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.811313 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.811324 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:41 crc kubenswrapper[4771]: I0319 15:16:41.811892 4771 scope.go:117] "RemoveContainer" containerID="efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6" Mar 19 15:16:41 crc kubenswrapper[4771]: E0319 15:16:41.812129 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 15:16:42 crc kubenswrapper[4771]: 
I0319 15:16:42.439091 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:42 crc kubenswrapper[4771]: I0319 15:16:42.814510 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 15:16:43 crc kubenswrapper[4771]: I0319 15:16:43.441795 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:44 crc kubenswrapper[4771]: E0319 15:16:44.270356 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 15:16:44 crc kubenswrapper[4771]: I0319 15:16:44.275403 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:44 crc kubenswrapper[4771]: I0319 15:16:44.276562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:44 crc kubenswrapper[4771]: I0319 15:16:44.276602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:44 crc kubenswrapper[4771]: I0319 15:16:44.276614 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:44 crc kubenswrapper[4771]: I0319 15:16:44.276643 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:16:44 crc kubenswrapper[4771]: 
E0319 15:16:44.281264 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 15:16:44 crc kubenswrapper[4771]: I0319 15:16:44.438873 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:45 crc kubenswrapper[4771]: I0319 15:16:45.439292 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:45 crc kubenswrapper[4771]: I0319 15:16:45.624977 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:16:45 crc kubenswrapper[4771]: I0319 15:16:45.625282 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:45 crc kubenswrapper[4771]: I0319 15:16:45.626688 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:45 crc kubenswrapper[4771]: I0319 15:16:45.626746 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:45 crc kubenswrapper[4771]: I0319 15:16:45.626771 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:45 crc kubenswrapper[4771]: I0319 15:16:45.627635 4771 scope.go:117] "RemoveContainer" containerID="efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6" Mar 19 15:16:45 crc kubenswrapper[4771]: E0319 15:16:45.627905 4771 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 15:16:46 crc kubenswrapper[4771]: I0319 15:16:46.333523 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:16:46 crc kubenswrapper[4771]: I0319 15:16:46.333644 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:46 crc kubenswrapper[4771]: I0319 15:16:46.335164 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:46 crc kubenswrapper[4771]: I0319 15:16:46.335247 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:46 crc kubenswrapper[4771]: I0319 15:16:46.335287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:46 crc kubenswrapper[4771]: I0319 15:16:46.336201 4771 scope.go:117] "RemoveContainer" containerID="efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6" Mar 19 15:16:46 crc kubenswrapper[4771]: E0319 15:16:46.336533 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 15:16:46 crc kubenswrapper[4771]: I0319 15:16:46.442788 4771 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:47 crc kubenswrapper[4771]: I0319 15:16:47.437886 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:48 crc kubenswrapper[4771]: I0319 15:16:48.441514 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:49 crc kubenswrapper[4771]: I0319 15:16:49.438301 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:50 crc kubenswrapper[4771]: I0319 15:16:50.439474 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:51 crc kubenswrapper[4771]: E0319 15:16:51.274784 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 15:16:51 crc kubenswrapper[4771]: I0319 15:16:51.282175 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:51 crc kubenswrapper[4771]: I0319 15:16:51.283428 4771 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:51 crc kubenswrapper[4771]: I0319 15:16:51.283522 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:51 crc kubenswrapper[4771]: I0319 15:16:51.283586 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:51 crc kubenswrapper[4771]: I0319 15:16:51.283663 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:16:51 crc kubenswrapper[4771]: E0319 15:16:51.287126 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 15:16:51 crc kubenswrapper[4771]: I0319 15:16:51.439212 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:51 crc kubenswrapper[4771]: E0319 15:16:51.615291 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 15:16:52 crc kubenswrapper[4771]: I0319 15:16:52.369168 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 15:16:52 crc kubenswrapper[4771]: I0319 15:16:52.387560 4771 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 15:16:52 crc kubenswrapper[4771]: I0319 15:16:52.438015 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:53 crc 
kubenswrapper[4771]: I0319 15:16:53.441144 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:54 crc kubenswrapper[4771]: I0319 15:16:54.440338 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:54 crc kubenswrapper[4771]: W0319 15:16:54.793772 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 15:16:54 crc kubenswrapper[4771]: E0319 15:16:54.793958 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 15:16:55 crc kubenswrapper[4771]: I0319 15:16:55.440372 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:56 crc kubenswrapper[4771]: I0319 15:16:56.441435 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 15:16:56 crc kubenswrapper[4771]: I0319 15:16:56.584483 4771 csr.go:261] certificate signing request csr-mr8r6 is approved, waiting to be issued Mar 19 15:16:56 crc 
kubenswrapper[4771]: I0319 15:16:56.596133 4771 csr.go:257] certificate signing request csr-mr8r6 is issued Mar 19 15:16:56 crc kubenswrapper[4771]: I0319 15:16:56.687801 4771 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 15:16:57 crc kubenswrapper[4771]: I0319 15:16:57.277186 4771 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 15:16:57 crc kubenswrapper[4771]: I0319 15:16:57.598674 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-26 08:36:17.71358469 +0000 UTC Mar 19 15:16:57 crc kubenswrapper[4771]: I0319 15:16:57.598739 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6761h19m20.114850785s for next certificate rotation Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.287825 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.289651 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.289696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.289713 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.289840 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.300363 4771 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.300684 4771 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 19 15:16:58 crc 
kubenswrapper[4771]: E0319 15:16:58.300719 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.305238 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.305344 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.305366 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.305446 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.305470 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:16:58Z","lastTransitionTime":"2026-03-19T15:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.325575 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.335662 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.335729 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.335752 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.335781 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.335799 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:16:58Z","lastTransitionTime":"2026-03-19T15:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.352926 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.363829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.363888 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.363912 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.363942 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.363965 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:16:58Z","lastTransitionTime":"2026-03-19T15:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.381085 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.391145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.391176 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.391186 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.391203 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:16:58 crc kubenswrapper[4771]: I0319 15:16:58.391216 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:16:58Z","lastTransitionTime":"2026-03-19T15:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.402295 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.402516 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.402559 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.502665 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.602879 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.703029 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.804113 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:58 crc kubenswrapper[4771]: E0319 15:16:58.904928 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.005689 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.105832 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.206580 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.307649 4771 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.408493 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.509149 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.610326 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.710624 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.810720 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:16:59 crc kubenswrapper[4771]: E0319 15:16:59.910826 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.011325 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.111813 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.212280 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.312689 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.413797 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:00 crc 
kubenswrapper[4771]: I0319 15:17:00.508464 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:17:00 crc kubenswrapper[4771]: I0319 15:17:00.509808 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:00 crc kubenswrapper[4771]: I0319 15:17:00.509871 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:00 crc kubenswrapper[4771]: I0319 15:17:00.509889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:00 crc kubenswrapper[4771]: I0319 15:17:00.510895 4771 scope.go:117] "RemoveContainer" containerID="efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.511202 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.514164 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.614630 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.715690 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.816396 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 
15:17:00 crc kubenswrapper[4771]: E0319 15:17:00.917458 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.017806 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.118148 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.219187 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.320334 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.421223 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.521837 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.615797 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.622764 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.723465 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.824033 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: E0319 15:17:01.924627 4771 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 19 15:17:01 crc kubenswrapper[4771]: I0319 15:17:01.998306 4771 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.025618 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.126773 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.227066 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.327263 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.427433 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.527638 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.628682 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.728880 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.829196 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:02 crc kubenswrapper[4771]: E0319 15:17:02.929723 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.030294 4771 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.131405 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.231731 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.332367 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.433161 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.533555 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.633883 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.734063 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.834193 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:03 crc kubenswrapper[4771]: E0319 15:17:03.934796 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc kubenswrapper[4771]: E0319 15:17:04.035912 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc kubenswrapper[4771]: E0319 15:17:04.136972 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc 
kubenswrapper[4771]: E0319 15:17:04.237901 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc kubenswrapper[4771]: E0319 15:17:04.338034 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc kubenswrapper[4771]: E0319 15:17:04.438285 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc kubenswrapper[4771]: E0319 15:17:04.539079 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc kubenswrapper[4771]: E0319 15:17:04.640264 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc kubenswrapper[4771]: E0319 15:17:04.740674 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc kubenswrapper[4771]: E0319 15:17:04.841822 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:04 crc kubenswrapper[4771]: E0319 15:17:04.942672 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.043271 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.144457 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.244623 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.344770 4771 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.445907 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: I0319 15:17:05.508565 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 15:17:05 crc kubenswrapper[4771]: I0319 15:17:05.510713 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:05 crc kubenswrapper[4771]: I0319 15:17:05.510787 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:05 crc kubenswrapper[4771]: I0319 15:17:05.510806 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.546221 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.646720 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.747482 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.848560 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:05 crc kubenswrapper[4771]: E0319 15:17:05.948923 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.049762 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.150874 4771 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.251410 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.352301 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.452658 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.553233 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.653379 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.754364 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.854797 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:06 crc kubenswrapper[4771]: E0319 15:17:06.955891 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 15:17:07.057032 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 15:17:07.158310 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 15:17:07.259214 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 
15:17:07.360269 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 15:17:07.461028 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 15:17:07.561794 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 15:17:07.663233 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 15:17:07.764193 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 15:17:07.864494 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:07 crc kubenswrapper[4771]: E0319 15:17:07.965752 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.066245 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.167570 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.268678 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.369144 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.469851 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 
15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.571016 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.656012 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.661234 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.661293 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.661310 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.661333 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.661368 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:08Z","lastTransitionTime":"2026-03-19T15:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.677701 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.682625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.682682 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.682700 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.682822 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.682909 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:08Z","lastTransitionTime":"2026-03-19T15:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.698879 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.703544 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.703594 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.703612 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.703634 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.703650 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:08Z","lastTransitionTime":"2026-03-19T15:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.721565 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.726570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.726828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.726914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.727017 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:08 crc kubenswrapper[4771]: I0319 15:17:08.727104 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:08Z","lastTransitionTime":"2026-03-19T15:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.742746 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.743037 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.743078 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.843450 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:08 crc kubenswrapper[4771]: E0319 15:17:08.944277 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.045357 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.146472 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.246934 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.347504 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.448033 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.549153 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.649726 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.751050 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.852225 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:09 crc kubenswrapper[4771]: E0319 15:17:09.952590 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.053711 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.154704 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.255727 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.356852 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.458008 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.558635 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.659310 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.760104 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.860569 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:10 crc kubenswrapper[4771]: E0319 15:17:10.961309 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.061806 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.162182 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.263261 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.364220 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.464715 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.565706 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.616028 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.666242 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.767050 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.867258 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:11 crc kubenswrapper[4771]: E0319 15:17:11.967366 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.068127 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.169169 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.269391 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.370702 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.471423 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.572580 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.672826 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.773933 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.874457 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:12 crc kubenswrapper[4771]: E0319 15:17:12.975540 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.076372 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.177571 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.278647 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.378784 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.479066 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: I0319 15:17:13.508753 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 15:17:13 crc kubenswrapper[4771]: I0319 15:17:13.510372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:13 crc kubenswrapper[4771]: I0319 15:17:13.510431 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:13 crc kubenswrapper[4771]: I0319 15:17:13.510450 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.579679 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.680652 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.781411 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: I0319 15:17:13.849311 4771 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.881816 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:13 crc kubenswrapper[4771]: E0319 15:17:13.982749 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.083804 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.184937 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.285546 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.386058 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.486741 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.587594 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.687763 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.788748 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.889771 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:14 crc kubenswrapper[4771]: E0319 15:17:14.990898 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.091104 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.191318 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.191935 4771 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.293580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.293633 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.293649 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.293673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.293691 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:15Z","lastTransitionTime":"2026-03-19T15:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.396228 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.396294 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.396311 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.396335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.396356 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:15Z","lastTransitionTime":"2026-03-19T15:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.478620 4771 apiserver.go:52] "Watching apiserver"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.485600 4771 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.486221 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.486849 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.487080 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.487195 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.487519 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.487589 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.488410 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.488448 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.489485 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.489738 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.493898 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.494247 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.494369 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.494675 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.494683 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.496200 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.497631 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.497760 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.498175 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.503428 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:15 crc 
kubenswrapper[4771]: I0319 15:17:15.503485 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.503509 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.503540 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.503562 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:15Z","lastTransitionTime":"2026-03-19T15:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.526681 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.527481 4771 scope.go:117] "RemoveContainer" containerID="efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.528123 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.534581 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.539229 4771 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.547548 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.561631 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.571721 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.587182 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.596168 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.599497 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.599611 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" 
(UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.599683 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.599954 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600151 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600357 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.599730 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600467 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600493 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600519 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600529 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600571 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600595 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600643 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600675 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600707 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600731 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600782 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600801 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600823 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 
15:17:15.600843 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600863 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600882 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600927 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600956 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.600978 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601030 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601061 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601082 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601103 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601122 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601142 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601173 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601195 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601730 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601738 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601897 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.602036 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.602132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.602267 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.602319 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.602473 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.602458 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.602546 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.602831 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.602930 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.603221 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.603208 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.603273 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.603437 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.603487 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.603515 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.603649 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.603814 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.603904 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.604032 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.604286 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.604503 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.604510 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.604718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.604544 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.601216 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606341 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606459 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606654 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606699 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606729 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606750 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606773 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606794 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606817 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606847 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606865 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606885 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607075 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607101 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607143 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607330 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607359 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607382 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607406 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607447 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607471 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607492 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607565 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607589 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607613 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607663 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607684 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607712 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607736 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607758 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607912 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607937 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607954 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608010 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608032 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608060 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608085 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608102 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608130 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608154 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608177 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608311 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608339 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:15Z","lastTransitionTime":"2026-03-19T15:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.606952 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.607718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608417 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608463 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608792 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.609308 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.609411 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.609586 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.608195 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.609929 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610039 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610078 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610125 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610168 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610203 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610245 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610290 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610325 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610371 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610411 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610453 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610488 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610532 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319
15:17:15.610576 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610662 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610707 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610750 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610785 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610825 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610813 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610868 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611273 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.609807 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611358 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611433 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611495 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611593 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.609846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.612208 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.612506 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611663 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.613627 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.613689 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.613742 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.613779 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.613820 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.613870 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.613905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.613947 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610012 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614022 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610039 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614069 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614107 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614152 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614198 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610275 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.615072 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610326 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614249 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.615378 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610351 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610330 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610482 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610679 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610771 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.610837 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611056 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611214 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611300 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611309 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.611554 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614379 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614645 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614651 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.614720 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.615307 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.615596 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.615954 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.616385 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.615965 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.616800 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.617048 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.616680 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.616627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.616960 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.617377 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.615476 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.617439 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.617858 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.617909 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.617941 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.617968 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618016 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618045 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618072 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618095 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618123 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618152 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618176 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618233 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618314 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618340 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618370 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618393 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618421 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618492 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618519 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618542 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618568 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618593 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618615 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618643 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618671 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618701 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618727 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618754 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618779 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618801 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618828 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618853 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618875 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618902 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.618960 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619002 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619031 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619056 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619084 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619109 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619137 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619167 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619195 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619221 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619247 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619271 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619313 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619342 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619372 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619402 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619440 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619467 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619491 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619501 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619517 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619622 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619672 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619722 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.619940 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620032 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620047 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620108 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620137 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620163 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620204 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620250 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620365 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620460 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620507 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620535 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620552 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620607 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620670 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620693 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620743 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620787 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620801 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620836 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.620931 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621068 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621156 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621224 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621293 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621335 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621349 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621357 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621494 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621499 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621719 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.621956 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622048 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName:
"proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622065 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622118 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622168 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622173 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622213 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622259 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622300 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622391 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622437 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622489 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622519 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622532 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622688 4771 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622699 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622716 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622769 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622800 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622837 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622947 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.622970 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623038 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc 
kubenswrapper[4771]: I0319 15:17:15.623067 4771 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623091 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623114 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623142 4771 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623186 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623212 4771 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623236 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623265 4771 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623294 4771 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623317 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623340 4771 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623375 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623402 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623431 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623461 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623485 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623508 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623531 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623559 4771 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623584 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623607 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623631 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623660 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623684 4771 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623707 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623730 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623761 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623783 4771 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623806 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623833 4771 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623855 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623876 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623898 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623926 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623948 4771 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623970 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") 
on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624028 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624055 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624076 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624099 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624128 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624149 4771 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624170 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624191 4771 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624217 4771 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623328 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.623880 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624150 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624539 4771 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624579 4771 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624614 4771 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624637 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624660 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624682 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624708 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc 
kubenswrapper[4771]: I0319 15:17:15.624730 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624751 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624771 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624811 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624830 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624851 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624878 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624903 4771 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624924 4771 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.624944 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.625031 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.625055 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.625076 4771 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.625096 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627161 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627227 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627275 4771 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627307 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627337 4771 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627367 4771 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627406 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627434 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on 
node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627466 4771 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627497 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627590 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627671 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627702 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627740 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627769 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 
15:17:15.627797 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.625301 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.625321 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.625655 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.626067 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.626132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.626411 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.626514 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.626532 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.626893 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627162 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627438 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.627773 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.628589 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.629269 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.629269 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.631123 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.631255 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.631569 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.631723 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.631842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.632113 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.632374 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.632723 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.632845 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.633205 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.633700 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.634356 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.634823 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.634854 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.635197 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.635475 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.635474 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.635797 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.636047 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.631684 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.636867 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.637162 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-19 15:17:16.137131763 +0000 UTC m=+95.365753005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.637328 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:17:16.137314477 +0000 UTC m=+95.365935689 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.637467 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.637498 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.638117 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.633353 4771 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.638360 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.638547 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.638927 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.635230 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.635222 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.639404 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.635561 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.639449 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.639593 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.640223 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.640927 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.641149 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.642082 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.642104 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.642297 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.643004 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.644032 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.646032 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.646329 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.646382 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.646555 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.646827 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.640552 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.640654 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.641765 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.647395 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.647805 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.647901 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:16.147507741 +0000 UTC m=+95.376128953 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.647937 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.648207 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.649199 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.649390 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.649415 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.650059 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.650131 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.650258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.650608 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.650615 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.651371 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.651671 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.651847 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.652132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.652155 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.652811 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.653086 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.652411 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.652864 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.653744 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.664164 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.665084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.665764 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.666144 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.666271 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.666469 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:16.166436727 +0000 UTC m=+95.395057959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.666784 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.666944 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.667146 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.667403 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:16.167371162 +0000 UTC m=+95.395992484 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.667414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.670641 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.673104 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.673762 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.678829 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.679047 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.679147 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.679329 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.679744 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.680021 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.680249 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.680366 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.680786 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.681215 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.683956 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.693362 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.704315 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.705684 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.710813 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.710845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.710857 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.710875 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.710886 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:15Z","lastTransitionTime":"2026-03-19T15:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.728389 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.728535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.728980 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.729301 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.729472 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.729615 4771 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node 
\"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.729747 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.729889 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.730053 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.730177 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.730319 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.730466 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.730601 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.730775 4771 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.730955 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.731143 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.731298 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.731425 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.731628 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.731786 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.731918 4771 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.732286 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.732437 4771 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.732575 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.732702 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.732838 4771 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.732969 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.733194 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.733355 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.733515 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.733645 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.733786 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.733921 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.734101 4771 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.734250 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath 
\"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.734389 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.734565 4771 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.734721 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.734929 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.735143 4771 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.735306 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.735452 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.735589 4771 
reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.735725 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.735885 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.736070 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.736199 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.736367 4771 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.736521 4771 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.736647 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.736784 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.736917 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.737077 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.737220 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.737351 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.737522 4771 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.737662 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath 
\"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.737797 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.737919 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.738072 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.738203 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.738398 4771 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.738585 4771 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.738727 4771 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.738856 4771 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.739023 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.739282 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.739414 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.739552 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.739687 4771 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.739850 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.740086 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.740251 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.740391 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.740519 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.740641 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.740777 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.740905 4771 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.741074 4771 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.741245 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.741398 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.741531 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.741682 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.741827 4771 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.741962 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.742155 4771 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.742298 4771 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.742448 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.742587 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.742733 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.742925 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.743192 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.743351 4771 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc 
kubenswrapper[4771]: I0319 15:17:15.743495 4771 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.743678 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.743880 4771 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.744075 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.744256 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.744444 4771 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.744633 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.744793 4771 reconciler_common.go:293] "Volume 
detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.744934 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.745113 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.745254 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.745375 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.745500 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.745638 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.729105 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.814246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.814319 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.814337 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.814362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.814381 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:15Z","lastTransitionTime":"2026-03-19T15:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.815129 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.832434 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.839746 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:15 crc kubenswrapper[4771]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 15:17:15 crc kubenswrapper[4771]: set -o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 15:17:15 crc kubenswrapper[4771]: source /etc/kubernetes/apiserver-url.env Mar 19 15:17:15 crc kubenswrapper[4771]: else Mar 19 15:17:15 crc kubenswrapper[4771]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 15:17:15 crc kubenswrapper[4771]: exit 1 Mar 19 15:17:15 crc kubenswrapper[4771]: fi Mar 19 15:17:15 crc kubenswrapper[4771]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 15:17:15 crc kubenswrapper[4771]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:15 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.842156 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.846789 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.854481 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:15 crc kubenswrapper[4771]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 15:17:15 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then Mar 19 15:17:15 crc kubenswrapper[4771]: set -o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: source "/env/_master" Mar 19 15:17:15 crc kubenswrapper[4771]: set +o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: fi Mar 19 15:17:15 crc kubenswrapper[4771]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 19 15:17:15 crc kubenswrapper[4771]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 15:17:15 crc kubenswrapper[4771]: ho_enable="--enable-hybrid-overlay" Mar 19 15:17:15 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 15:17:15 crc kubenswrapper[4771]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 15:17:15 crc kubenswrapper[4771]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 15:17:15 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 15:17:15 crc kubenswrapper[4771]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 15:17:15 crc kubenswrapper[4771]: --webhook-host=127.0.0.1 \ Mar 19 15:17:15 crc kubenswrapper[4771]: --webhook-port=9743 \ Mar 19 15:17:15 crc kubenswrapper[4771]: ${ho_enable} \ Mar 19 15:17:15 crc kubenswrapper[4771]: --enable-interconnect \ Mar 19 15:17:15 crc kubenswrapper[4771]: --disable-approver \ Mar 19 
15:17:15 crc kubenswrapper[4771]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 15:17:15 crc kubenswrapper[4771]: --wait-for-kubernetes-api=200s \ Mar 19 15:17:15 crc kubenswrapper[4771]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 15:17:15 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}" Mar 19 15:17:15 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Std
in:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:15 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.859256 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:15 crc kubenswrapper[4771]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 15:17:15 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then Mar 19 15:17:15 crc kubenswrapper[4771]: set -o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: source "/env/_master" Mar 19 15:17:15 crc kubenswrapper[4771]: set +o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: fi Mar 19 15:17:15 crc kubenswrapper[4771]: Mar 19 15:17:15 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 15:17:15 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 15:17:15 crc kubenswrapper[4771]: --disable-webhook \ Mar 19 15:17:15 crc kubenswrapper[4771]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 15:17:15 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}" Mar 19 15:17:15 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:15 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:15 crc kubenswrapper[4771]: W0319 15:17:15.860274 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-578ade6040767ac66e6cfdc001f6d12db61e5f378797b089647f7e1928daaf66 WatchSource:0}: Error finding container 578ade6040767ac66e6cfdc001f6d12db61e5f378797b089647f7e1928daaf66: Status 404 returned error can't find the container with id 
578ade6040767ac66e6cfdc001f6d12db61e5f378797b089647f7e1928daaf66 Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.860458 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.864296 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.865891 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.900013 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"578ade6040767ac66e6cfdc001f6d12db61e5f378797b089647f7e1928daaf66"} Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.901420 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"82ad6da15ec609f8a69cd4701bd3d77548719b100ef0f450d8409ac5a280c162"} Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.903026 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.903388 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3370828aa2ff7c704950af3e2cd31cd55ef47010eec4002bc41a26d5b974f632"} Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.903875 4771 scope.go:117] "RemoveContainer" containerID="efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.904046 4771 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.904099 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.905557 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:15 crc kubenswrapper[4771]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 15:17:15 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then Mar 19 15:17:15 crc kubenswrapper[4771]: set -o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: source "/env/_master" Mar 19 15:17:15 crc kubenswrapper[4771]: set +o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: fi Mar 19 15:17:15 crc kubenswrapper[4771]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 19 15:17:15 crc kubenswrapper[4771]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 15:17:15 crc kubenswrapper[4771]: ho_enable="--enable-hybrid-overlay" Mar 19 15:17:15 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 15:17:15 crc kubenswrapper[4771]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 15:17:15 crc kubenswrapper[4771]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 15:17:15 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 15:17:15 crc kubenswrapper[4771]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 15:17:15 crc kubenswrapper[4771]: --webhook-host=127.0.0.1 \ Mar 19 15:17:15 crc kubenswrapper[4771]: --webhook-port=9743 \ Mar 19 15:17:15 crc kubenswrapper[4771]: ${ho_enable} \ Mar 19 15:17:15 crc kubenswrapper[4771]: --enable-interconnect \ Mar 19 15:17:15 crc kubenswrapper[4771]: --disable-approver \ Mar 19 15:17:15 crc kubenswrapper[4771]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 15:17:15 crc kubenswrapper[4771]: --wait-for-kubernetes-api=200s \ Mar 19 15:17:15 crc kubenswrapper[4771]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 15:17:15 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}" Mar 19 15:17:15 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:15 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.905772 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:15 crc kubenswrapper[4771]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 
15:17:15 crc kubenswrapper[4771]: set -o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 15:17:15 crc kubenswrapper[4771]: source /etc/kubernetes/apiserver-url.env Mar 19 15:17:15 crc kubenswrapper[4771]: else Mar 19 15:17:15 crc kubenswrapper[4771]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 15:17:15 crc kubenswrapper[4771]: exit 1 Mar 19 15:17:15 crc kubenswrapper[4771]: fi Mar 19 15:17:15 crc kubenswrapper[4771]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 15:17:15 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:15 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 
15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.906916 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.908144 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:15 crc kubenswrapper[4771]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 15:17:15 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then Mar 19 15:17:15 crc kubenswrapper[4771]: set -o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: source "/env/_master" Mar 19 15:17:15 crc kubenswrapper[4771]: set +o allexport Mar 19 15:17:15 crc kubenswrapper[4771]: fi Mar 19 15:17:15 crc kubenswrapper[4771]: Mar 19 15:17:15 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 15:17:15 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 15:17:15 crc kubenswrapper[4771]: --disable-webhook \ Mar 19 15:17:15 crc kubenswrapper[4771]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 15:17:15 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}" Mar 19 15:17:15 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:15 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:15 crc kubenswrapper[4771]: E0319 15:17:15.909825 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.917247 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.918293 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.918358 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.918416 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.918450 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.918479 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:15Z","lastTransitionTime":"2026-03-19T15:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.932055 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.945911 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.960918 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.978334 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:15 crc kubenswrapper[4771]: I0319 15:17:15.993025 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.007603 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.021580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.021655 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.021673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.021699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.021718 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.025510 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.040611 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.054810 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.068970 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.087185 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.101370 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.113848 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.124914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.125038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.125104 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.125132 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.125149 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.152312 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.152400 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.152471 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:17:17.15245228 +0000 UTC m=+96.381073492 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.152500 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.152529 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.152588 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.152601 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:17.152583103 +0000 UTC m=+96.381204345 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.152620 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:17.152611694 +0000 UTC m=+96.381232906 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.227270 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.227332 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.227350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.227375 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.227392 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.253292 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.253353 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.253482 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.253500 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.253511 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.253556 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.253593 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.253614 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.253566 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:17.253552445 +0000 UTC m=+96.482173647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.253698 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:17.253675318 +0000 UTC m=+96.482296550 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.330576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.330644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.330668 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.330699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.330722 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.433376 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.433433 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.433450 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.433473 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.433489 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.508175 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:17:16 crc kubenswrapper[4771]: E0319 15:17:16.508389 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.536420 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.536476 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.536501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.536530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.536549 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.638706 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.638744 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.638757 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.638777 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.638791 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.741267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.741317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.741336 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.741359 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.741374 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.843811 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.843874 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.843890 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.843925 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.843942 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.947189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.947269 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.947296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.947328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:16 crc kubenswrapper[4771]: I0319 15:17:16.947350 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:16Z","lastTransitionTime":"2026-03-19T15:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.051533 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.051593 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.051611 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.051647 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.051665 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.155414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.155453 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.155462 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.155478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.155488 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.160086 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.160134 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.160160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.160265 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.160328 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:17:19.160291533 +0000 UTC m=+98.388912775 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.160356 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.160421 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:19.160377136 +0000 UTC m=+98.388998488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.160503 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:19.160468288 +0000 UTC m=+98.389089540 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.257643 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.257667 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.257675 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.257688 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.257696 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.263428 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.263654 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.263713 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.263775 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.263787 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.263886 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.263909 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.264839 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.264906 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:19.264885612 +0000 UTC m=+98.493506814 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.265036 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:19.265029016 +0000 UTC m=+98.493650218 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.361284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.361364 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.361384 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.361415 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.361448 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.463924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.464038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.464408 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.464482 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.465373 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.508648 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.508801 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.509261 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:17:17 crc kubenswrapper[4771]: E0319 15:17:17.509511 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.513485 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.514065 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.515250 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.515852 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.516901 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.517528 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.518219 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.519224 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.520114 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.521070 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.521560 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.522597 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.523328 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.523822 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.524758 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.525315 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.526222 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.526580 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.527154 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.528160 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.528607 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.529497 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.529888 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.530848 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.531301 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.531901 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.532947 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.533403 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.534431 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd"
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.534958 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.536046 4771 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.536167 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.538064 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.538965 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.539417 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.540889 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.541641 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.543379 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.545388 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.547750 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.548786 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.551021 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.552665 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.554937 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.556029 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.558120 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.559372 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.562024 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.563334 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.565250 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.568514 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.569040 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.569100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:17 crc 
kubenswrapper[4771]: I0319 15:17:17.569119 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.569142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.569159 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.569689 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.570546 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.571587 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.671931 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.671972 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.671999 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.672019 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.672032 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.774634 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.774700 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.774719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.774743 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.774762 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.878100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.878233 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.878268 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.878343 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.878366 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.981728 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.981884 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.981907 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.981932 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:17 crc kubenswrapper[4771]: I0319 15:17:17.981949 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:17Z","lastTransitionTime":"2026-03-19T15:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.085294 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.085360 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.085380 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.085414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.085441 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:18Z","lastTransitionTime":"2026-03-19T15:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.188851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.188895 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.188912 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.188932 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.188945 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:18Z","lastTransitionTime":"2026-03-19T15:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.291465 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.291508 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.291520 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.291537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.291548 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:18Z","lastTransitionTime":"2026-03-19T15:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.393045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.393078 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.393090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.393104 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.393114 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:18Z","lastTransitionTime":"2026-03-19T15:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.495091 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.495452 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.495629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.495806 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.496014 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:18Z","lastTransitionTime":"2026-03-19T15:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.508649 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:18 crc kubenswrapper[4771]: E0319 15:17:18.508913 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.599865 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.599928 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.599946 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.600022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.600047 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:18Z","lastTransitionTime":"2026-03-19T15:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.702628 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.702898 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.702968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.703059 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.703119 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:18Z","lastTransitionTime":"2026-03-19T15:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.806152 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.806389 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.806490 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.806570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.806644 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:18Z","lastTransitionTime":"2026-03-19T15:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.909924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.909960 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.909974 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.910012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:18 crc kubenswrapper[4771]: I0319 15:17:18.910062 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:18Z","lastTransitionTime":"2026-03-19T15:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.005441 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.005497 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.005507 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.005526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.005536 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.019705 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.024193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.024401 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.024542 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.024694 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.024827 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.040190 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.051392 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.052478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.052501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.052519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.052535 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.061698 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.066260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.066305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.066315 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.066330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.066341 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.074711 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.078769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.078834 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.078855 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.078883 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.078907 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.091192 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.091415 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.093163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.093305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.093404 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.093823 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.093954 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.180090 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.180206 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.180250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.180376 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.180400 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.180447 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:23.180426 +0000 UTC m=+102.409047222 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.180489 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:23.180461361 +0000 UTC m=+102.409082603 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.180577 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:17:23.180563474 +0000 UTC m=+102.409184706 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.201855 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.201935 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.201961 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.202029 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.202055 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.281523 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.281578 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.281831 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.281840 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.281899 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.281921 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:19 crc 
kubenswrapper[4771]: E0319 15:17:19.281860 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.282023 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:23.281973368 +0000 UTC m=+102.510594630 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.282039 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.282291 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:23.282270155 +0000 UTC m=+102.510891387 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.304711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.304753 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.304766 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.304782 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.304794 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.407523 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.407613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.407638 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.407672 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.407695 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.508526 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.508711 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.508528 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:19 crc kubenswrapper[4771]: E0319 15:17:19.508856 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.510527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.510554 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.510566 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.510579 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.510587 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.612611 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.612677 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.612686 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.612699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.612712 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.715280 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.715357 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.715384 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.715415 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.715440 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.817885 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.817958 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.818019 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.818055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.818085 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.921023 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.921130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.921171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.921214 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:19 crc kubenswrapper[4771]: I0319 15:17:19.921236 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:19Z","lastTransitionTime":"2026-03-19T15:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.023720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.023777 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.023798 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.023827 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.023847 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.127561 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.127687 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.127746 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.127776 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.127795 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.231052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.231133 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.231161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.231187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.231204 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.334574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.334626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.334639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.334656 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.334669 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.437758 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.437838 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.437854 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.437878 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.437895 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.508423 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:20 crc kubenswrapper[4771]: E0319 15:17:20.508535 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.539792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.540140 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.540187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.540215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.540234 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.643151 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.643188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.643198 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.643215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.643232 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.746280 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.746339 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.746353 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.746373 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.746388 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.848625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.848695 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.848712 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.848739 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.848757 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.951161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.951193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.951202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.951215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:20 crc kubenswrapper[4771]: I0319 15:17:20.951226 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:20Z","lastTransitionTime":"2026-03-19T15:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.054396 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.054448 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.054459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.054474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.054486 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.158063 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.158171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.158237 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.158267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.158287 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.261342 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.261570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.261595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.261627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.261651 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.364511 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.364922 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.365216 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.365415 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.365550 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.469324 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.469382 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.469401 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.469425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.469444 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.508089 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:21 crc kubenswrapper[4771]: E0319 15:17:21.508257 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.508432 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:21 crc kubenswrapper[4771]: E0319 15:17:21.508612 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.521536 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.538339 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.551302 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.566332 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.572374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.572618 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.572755 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.572898 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.573220 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.578739 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.591623 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.603201 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.676216 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.676266 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.676283 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.676307 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.676324 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.779585 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.780053 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.780198 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.780353 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.780699 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.884032 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.884108 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.884126 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.884150 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.884168 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.985952 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.986003 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.986013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.986025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:21 crc kubenswrapper[4771]: I0319 15:17:21.986034 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:21Z","lastTransitionTime":"2026-03-19T15:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.088568 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.088633 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.088650 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.088673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.088693 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:22Z","lastTransitionTime":"2026-03-19T15:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.191366 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.191418 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.191434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.191457 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.191474 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:22Z","lastTransitionTime":"2026-03-19T15:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.293908 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.293956 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.293967 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.293997 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.294010 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:22Z","lastTransitionTime":"2026-03-19T15:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.328950 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hg7b2"] Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.329275 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hg7b2" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.332554 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.332969 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.333025 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.345589 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.363348 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.373348 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.388449 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.396136 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.396190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.396207 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.396231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.396250 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:22Z","lastTransitionTime":"2026-03-19T15:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.405057 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.407890 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc-hosts-file\") pod \"node-resolver-hg7b2\" (UID: \"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\") " pod="openshift-dns/node-resolver-hg7b2" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.407945 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzjlt\" (UniqueName: \"kubernetes.io/projected/ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc-kube-api-access-jzjlt\") pod \"node-resolver-hg7b2\" (UID: \"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\") " pod="openshift-dns/node-resolver-hg7b2" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.418593 4771 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.427714 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.434544 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.499187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.499292 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.499313 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.499346 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 
15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.499367 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:22Z","lastTransitionTime":"2026-03-19T15:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.507866 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:22 crc kubenswrapper[4771]: E0319 15:17:22.508118 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.508663 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzjlt\" (UniqueName: \"kubernetes.io/projected/ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc-kube-api-access-jzjlt\") pod \"node-resolver-hg7b2\" (UID: \"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\") " pod="openshift-dns/node-resolver-hg7b2" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.509041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc-hosts-file\") pod \"node-resolver-hg7b2\" (UID: \"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\") " pod="openshift-dns/node-resolver-hg7b2" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.509186 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc-hosts-file\") pod \"node-resolver-hg7b2\" (UID: \"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\") " pod="openshift-dns/node-resolver-hg7b2" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.525710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzjlt\" (UniqueName: \"kubernetes.io/projected/ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc-kube-api-access-jzjlt\") pod \"node-resolver-hg7b2\" (UID: \"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\") " pod="openshift-dns/node-resolver-hg7b2" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.603200 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.603278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:22 crc 
kubenswrapper[4771]: I0319 15:17:22.603295 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.603320 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.603337 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:22Z","lastTransitionTime":"2026-03-19T15:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.655862 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hg7b2" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.684951 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-wqbzp"] Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.685467 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-nmdkf"] Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.685644 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.688845 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.688876 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.689625 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-9989m"] Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.690411 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.690617 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.691248 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.692562 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.692681 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 15:17:22 crc kubenswrapper[4771]: E0319 15:17:22.692757 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:22 crc kubenswrapper[4771]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 19 15:17:22 crc kubenswrapper[4771]: set -uo pipefail Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 19 15:17:22 crc kubenswrapper[4771]: HOSTS_FILE="/etc/hosts" Mar 19 15:17:22 crc kubenswrapper[4771]: TEMP_FILE="/etc/hosts.tmp" Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: # Make a temporary file with the old hosts file's attributes. Mar 19 15:17:22 crc kubenswrapper[4771]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 19 15:17:22 crc kubenswrapper[4771]: echo "Failed to preserve hosts file. Exiting." 
Mar 19 15:17:22 crc kubenswrapper[4771]: exit 1 Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: while true; do Mar 19 15:17:22 crc kubenswrapper[4771]: declare -A svc_ips Mar 19 15:17:22 crc kubenswrapper[4771]: for svc in "${services[@]}"; do Mar 19 15:17:22 crc kubenswrapper[4771]: # Fetch service IP from cluster dns if present. We make several tries Mar 19 15:17:22 crc kubenswrapper[4771]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 19 15:17:22 crc kubenswrapper[4771]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 19 15:17:22 crc kubenswrapper[4771]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 19 15:17:22 crc kubenswrapper[4771]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 15:17:22 crc kubenswrapper[4771]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 15:17:22 crc kubenswrapper[4771]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 15:17:22 crc kubenswrapper[4771]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 19 15:17:22 crc kubenswrapper[4771]: for i in ${!cmds[*]} Mar 19 15:17:22 crc kubenswrapper[4771]: do Mar 19 15:17:22 crc kubenswrapper[4771]: ips=($(eval "${cmds[i]}")) Mar 19 15:17:22 crc kubenswrapper[4771]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 19 15:17:22 crc kubenswrapper[4771]: svc_ips["${svc}"]="${ips[@]}" Mar 19 15:17:22 crc kubenswrapper[4771]: break Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: # Update /etc/hosts only if we get valid service IPs Mar 19 15:17:22 crc kubenswrapper[4771]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 19 15:17:22 crc kubenswrapper[4771]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 19 15:17:22 crc kubenswrapper[4771]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 19 15:17:22 crc kubenswrapper[4771]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 19 15:17:22 crc kubenswrapper[4771]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 19 15:17:22 crc kubenswrapper[4771]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 19 15:17:22 crc kubenswrapper[4771]: sleep 60 & wait Mar 19 15:17:22 crc kubenswrapper[4771]: continue Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: # Append resolver entries for services Mar 19 15:17:22 crc kubenswrapper[4771]: rc=0 Mar 19 15:17:22 crc kubenswrapper[4771]: for svc in "${!svc_ips[@]}"; do Mar 19 15:17:22 crc kubenswrapper[4771]: for ip in ${svc_ips[${svc}]}; do Mar 19 15:17:22 crc kubenswrapper[4771]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: if [[ $rc -ne 0 ]]; then Mar 19 15:17:22 crc kubenswrapper[4771]: sleep 60 & wait Mar 19 15:17:22 crc kubenswrapper[4771]: continue Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 19 15:17:22 crc kubenswrapper[4771]: # Replace /etc/hosts with our modified version if needed Mar 19 15:17:22 crc kubenswrapper[4771]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 19 15:17:22 crc kubenswrapper[4771]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: sleep 60 & wait Mar 19 15:17:22 crc kubenswrapper[4771]: unset svc_ips Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jzjlt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-hg7b2_openshift-dns(ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:22 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:22 crc kubenswrapper[4771]: E0319 15:17:22.693920 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-hg7b2" podUID="ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.696782 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.701182 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 15:17:22 crc 
kubenswrapper[4771]: I0319 15:17:22.701191 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.701426 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.701624 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.702109 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.702832 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.707433 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.707479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.707492 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.707510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.707523 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:22Z","lastTransitionTime":"2026-03-19T15:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.709711 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711256 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-var-lib-kubelet\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711287 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-cnibin\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711318 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-cni-binary-copy\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 
15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711337 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rt68\" (UniqueName: \"kubernetes.io/projected/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-kube-api-access-8rt68\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711358 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-socket-dir-parent\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711379 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-run-multus-certs\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711402 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-var-lib-cni-bin\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711423 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-var-lib-cni-multus\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " 
pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711443 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-os-release\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711467 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2b6e948-bbef-4217-b0eb-4cdbf711037c-proxy-tls\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711583 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-daemon-config\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711830 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-cni-binary-copy\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.711979 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712122 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgtqq\" (UniqueName: \"kubernetes.io/projected/f2b6e948-bbef-4217-b0eb-4cdbf711037c-kube-api-access-wgtqq\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712225 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f2b6e948-bbef-4217-b0eb-4cdbf711037c-rootfs\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712250 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-cni-dir\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712270 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-run-k8s-cni-cncf-io\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712325 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f2b6e948-bbef-4217-b0eb-4cdbf711037c-mcd-auth-proxy-config\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712353 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-etc-kubernetes\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712401 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-os-release\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-system-cni-dir\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712496 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-cnibin\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-conf-dir\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712568 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lvb\" (UniqueName: \"kubernetes.io/projected/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-kube-api-access-26lvb\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712601 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-system-cni-dir\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712621 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-hostroot\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.712673 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-run-netns\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.724553 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.738331 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.750637 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.765504 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.777875 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.791031 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.799076 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.807923 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.809852 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.809913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.809933 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.809964 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.809982 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:22Z","lastTransitionTime":"2026-03-19T15:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813733 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-cni-binary-copy\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813776 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgtqq\" (UniqueName: \"kubernetes.io/projected/f2b6e948-bbef-4217-b0eb-4cdbf711037c-kube-api-access-wgtqq\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813822 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " 
pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813844 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f2b6e948-bbef-4217-b0eb-4cdbf711037c-rootfs\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813862 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-cni-dir\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813879 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-run-k8s-cni-cncf-io\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813896 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f2b6e948-bbef-4217-b0eb-4cdbf711037c-mcd-auth-proxy-config\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813915 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-etc-kubernetes\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 
15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813934 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-system-cni-dir\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813936 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f2b6e948-bbef-4217-b0eb-4cdbf711037c-rootfs\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.813951 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-os-release\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-cnibin\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814025 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-conf-dir\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814041 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-26lvb\" (UniqueName: \"kubernetes.io/projected/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-kube-api-access-26lvb\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814069 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-system-cni-dir\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-hostroot\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814110 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-run-netns\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814138 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-cni-binary-copy\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-var-lib-kubelet\") 
pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814172 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-cnibin\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-os-release\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814211 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-hostroot\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814185 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-cni-dir\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rt68\" (UniqueName: \"kubernetes.io/projected/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-kube-api-access-8rt68\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814281 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-conf-dir\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814294 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-socket-dir-parent\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814341 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-var-lib-cni-multus\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814380 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-run-multus-certs\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814411 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-var-lib-cni-bin\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814423 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-system-cni-dir\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814448 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-os-release\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814472 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-socket-dir-parent\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814479 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-run-netns\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814488 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2b6e948-bbef-4217-b0eb-4cdbf711037c-proxy-tls\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814517 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-daemon-config\") pod 
\"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814639 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814743 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-run-k8s-cni-cncf-io\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814810 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-etc-kubernetes\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-run-multus-certs\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814915 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " 
pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814950 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-var-lib-cni-multus\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.814964 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-system-cni-dir\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.815035 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-cnibin\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.815047 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-cni-binary-copy\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.815066 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-var-lib-cni-bin\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 
15:17:22.815061 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-host-var-lib-kubelet\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.815127 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-os-release\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.815223 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-cni-binary-copy\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.815266 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-multus-daemon-config\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.815323 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f2b6e948-bbef-4217-b0eb-4cdbf711037c-mcd-auth-proxy-config\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.816561 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-cnibin\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.817030 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.819869 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2b6e948-bbef-4217-b0eb-4cdbf711037c-proxy-tls\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.827433 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.831254 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgtqq\" (UniqueName: \"kubernetes.io/projected/f2b6e948-bbef-4217-b0eb-4cdbf711037c-kube-api-access-wgtqq\") pod \"machine-config-daemon-wqbzp\" (UID: \"f2b6e948-bbef-4217-b0eb-4cdbf711037c\") " pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.832339 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lvb\" (UniqueName: \"kubernetes.io/projected/51f8c2de-454d-4b7c-bf30-2f5d12d7088e-kube-api-access-26lvb\") pod \"multus-9989m\" (UID: \"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\") " pod="openshift-multus/multus-9989m" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.834114 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rt68\" (UniqueName: \"kubernetes.io/projected/7afaaec8-b9d9-4b61-8bd2-3517ef7de1db-kube-api-access-8rt68\") pod \"multus-additional-cni-plugins-nmdkf\" (UID: \"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\") " pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.837154 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.848625 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.859205 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.869425 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.879919 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.890172 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.902043 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.913349 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.913390 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.913403 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.913420 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.913433 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:22Z","lastTransitionTime":"2026-03-19T15:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.914152 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.924785 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.925350 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hg7b2" event={"ID":"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc","Type":"ContainerStarted","Data":"103752713f824bb33d5d91615cf91992ac41ba23ff8693376fa1b005499c7634"} Mar 19 15:17:22 crc kubenswrapper[4771]: E0319 15:17:22.927089 4771 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:22 crc kubenswrapper[4771]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 19 15:17:22 crc kubenswrapper[4771]: set -uo pipefail Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 19 15:17:22 crc kubenswrapper[4771]: HOSTS_FILE="/etc/hosts" Mar 19 15:17:22 crc kubenswrapper[4771]: TEMP_FILE="/etc/hosts.tmp" Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: # Make a temporary file with the old hosts file's attributes. Mar 19 15:17:22 crc kubenswrapper[4771]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 19 15:17:22 crc kubenswrapper[4771]: echo "Failed to preserve hosts file. Exiting." Mar 19 15:17:22 crc kubenswrapper[4771]: exit 1 Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: while true; do Mar 19 15:17:22 crc kubenswrapper[4771]: declare -A svc_ips Mar 19 15:17:22 crc kubenswrapper[4771]: for svc in "${services[@]}"; do Mar 19 15:17:22 crc kubenswrapper[4771]: # Fetch service IP from cluster dns if present. We make several tries Mar 19 15:17:22 crc kubenswrapper[4771]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 19 15:17:22 crc kubenswrapper[4771]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 19 15:17:22 crc kubenswrapper[4771]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 19 15:17:22 crc kubenswrapper[4771]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 15:17:22 crc kubenswrapper[4771]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 15:17:22 crc kubenswrapper[4771]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 15:17:22 crc kubenswrapper[4771]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 19 15:17:22 crc kubenswrapper[4771]: for i in ${!cmds[*]} Mar 19 15:17:22 crc kubenswrapper[4771]: do Mar 19 15:17:22 crc kubenswrapper[4771]: ips=($(eval "${cmds[i]}")) Mar 19 15:17:22 crc kubenswrapper[4771]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 19 15:17:22 crc kubenswrapper[4771]: svc_ips["${svc}"]="${ips[@]}" Mar 19 15:17:22 crc kubenswrapper[4771]: break Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: # Update /etc/hosts only if we get valid service IPs Mar 19 15:17:22 crc kubenswrapper[4771]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 19 15:17:22 crc kubenswrapper[4771]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 19 15:17:22 crc kubenswrapper[4771]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 19 15:17:22 crc kubenswrapper[4771]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 19 15:17:22 crc kubenswrapper[4771]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 19 15:17:22 crc kubenswrapper[4771]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 19 15:17:22 crc kubenswrapper[4771]: sleep 60 & wait Mar 19 15:17:22 crc kubenswrapper[4771]: continue Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: # Append resolver entries for services Mar 19 15:17:22 crc kubenswrapper[4771]: rc=0 Mar 19 15:17:22 crc kubenswrapper[4771]: for svc in "${!svc_ips[@]}"; do Mar 19 15:17:22 crc kubenswrapper[4771]: for ip in ${svc_ips[${svc}]}; do Mar 19 15:17:22 crc kubenswrapper[4771]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: if [[ $rc -ne 0 ]]; then Mar 19 15:17:22 crc kubenswrapper[4771]: sleep 60 & wait Mar 19 15:17:22 crc kubenswrapper[4771]: continue Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: Mar 19 15:17:22 crc kubenswrapper[4771]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 19 15:17:22 crc kubenswrapper[4771]: # Replace /etc/hosts with our modified version if needed Mar 19 15:17:22 crc kubenswrapper[4771]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 19 15:17:22 crc kubenswrapper[4771]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 19 15:17:22 crc kubenswrapper[4771]: fi Mar 19 15:17:22 crc kubenswrapper[4771]: sleep 60 & wait Mar 19 15:17:22 crc kubenswrapper[4771]: unset svc_ips Mar 19 15:17:22 crc kubenswrapper[4771]: done Mar 19 15:17:22 crc kubenswrapper[4771]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jzjlt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-hg7b2_openshift-dns(ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:22 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:22 crc kubenswrapper[4771]: E0319 15:17:22.928327 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-hg7b2" 
podUID="ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.934749 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.944825 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.955297 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.966360 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.978058 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:22 crc kubenswrapper[4771]: I0319 15:17:22.989284 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.000827 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.013118 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.016228 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.016280 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.016292 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.016312 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.016325 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.024338 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.026449 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.033632 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.035665 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9989m" Mar 19 15:17:23 crc kubenswrapper[4771]: W0319 15:17:23.036145 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2b6e948_bbef_4217_b0eb_4cdbf711037c.slice/crio-81f0a13165907da485e3b16aa4d91a9c2ba804f351b3a48422b51ba4f7973cb3 WatchSource:0}: Error finding container 81f0a13165907da485e3b16aa4d91a9c2ba804f351b3a48422b51ba4f7973cb3: Status 404 returned error can't find the container with id 81f0a13165907da485e3b16aa4d91a9c2ba804f351b3a48422b51ba4f7973cb3 Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.037951 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wgtqq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.040693 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wgtqq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.041948 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.045520 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.048447 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.052106 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:23 crc kubenswrapper[4771]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 19 15:17:23 crc kubenswrapper[4771]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 19 15:17:23 crc kubenswrapper[4771]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26lvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-9989m_openshift-multus(51f8c2de-454d-4b7c-bf30-2f5d12d7088e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:23 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.053202 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-9989m" podUID="51f8c2de-454d-4b7c-bf30-2f5d12d7088e" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.067006 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,
RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8rt68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nmdkf_openshift-multus(7afaaec8-b9d9-4b61-8bd2-3517ef7de1db): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.067294 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b6zx4"] Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.068172 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" podUID="7afaaec8-b9d9-4b61-8bd2-3517ef7de1db" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.068740 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.070241 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.070360 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.071217 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.071290 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.071221 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.071703 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.072118 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.077699 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.085141 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.095928 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.114300 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.116690 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-openvswitch\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.116751 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk5n5\" (UniqueName: \"kubernetes.io/projected/bf31981b-d437-4216-a275-5b566d8c49aa-kube-api-access-hk5n5\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.116788 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-netns\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117298 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-bin\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117352 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-script-lib\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117390 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-config\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117420 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-ovn\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117449 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-node-log\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117476 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117540 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-systemd-units\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117576 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-slash\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117604 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-systemd\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117636 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-etc-openvswitch\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117664 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-netd\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117725 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-env-overrides\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117759 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf31981b-d437-4216-a275-5b566d8c49aa-ovn-node-metrics-cert\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117790 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-kubelet\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117843 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-var-lib-openvswitch\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117870 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-log-socket\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.117914 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.119723 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.119859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.119957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.120075 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.120156 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.127509 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.144113 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.157597 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.168093 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.178094 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.189973 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.200601 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.209493 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.218832 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219015 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-slash\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.219050 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:17:31.219018978 +0000 UTC m=+110.447640220 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219194 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-systemd\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219268 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-systemd\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219120 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-slash\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219404 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-etc-openvswitch\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219514 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-netd\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219476 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-etc-openvswitch\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219593 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-env-overrides\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf31981b-d437-4216-a275-5b566d8c49aa-ovn-node-metrics-cert\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219663 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-netd\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219685 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-kubelet\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219713 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-kubelet\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219741 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.219793 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219803 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-var-lib-openvswitch\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219835 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-log-socket\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.219853 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:31.21983621 +0000 UTC m=+110.448457412 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219886 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219909 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-var-lib-openvswitch\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 
15:17:23.219927 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-openvswitch\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219956 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk5n5\" (UniqueName: \"kubernetes.io/projected/bf31981b-d437-4216-a275-5b566d8c49aa-kube-api-access-hk5n5\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219970 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-log-socket\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220032 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-netns\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220095 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-bin\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220078 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-openvswitch\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-bin\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220087 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-netns\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220149 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.219980 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.220262 4771 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220262 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-script-lib\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.220367 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:31.220344464 +0000 UTC m=+110.448965706 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220432 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-config\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-ovn\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220544 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-node-log\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220614 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220646 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-systemd-units\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220713 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-ovn\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220735 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-node-log\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220728 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-systemd-units\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.220750 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.221088 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-script-lib\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.222800 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-env-overrides\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.222966 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-config\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.223303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.223426 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.223529 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.223628 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.223459 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf31981b-d437-4216-a275-5b566d8c49aa-ovn-node-metrics-cert\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.223721 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.236965 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk5n5\" (UniqueName: \"kubernetes.io/projected/bf31981b-d437-4216-a275-5b566d8c49aa-kube-api-access-hk5n5\") pod \"ovnkube-node-b6zx4\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.321693 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.321761 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.321933 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.322014 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.322036 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.322112 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:31.322086566 +0000 UTC m=+110.550707808 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.321933 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.322180 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.322208 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.322311 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:31.322280501 +0000 UTC m=+110.550901743 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.326538 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.326695 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.326863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.326945 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.327028 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.388382 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.413850 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:23 crc kubenswrapper[4771]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 19 15:17:23 crc kubenswrapper[4771]: apiVersion: v1 Mar 19 15:17:23 crc kubenswrapper[4771]: clusters: Mar 19 15:17:23 crc kubenswrapper[4771]: - cluster: Mar 19 15:17:23 crc kubenswrapper[4771]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 19 15:17:23 crc kubenswrapper[4771]: server: https://api-int.crc.testing:6443 Mar 19 15:17:23 crc kubenswrapper[4771]: name: default-cluster Mar 19 15:17:23 crc kubenswrapper[4771]: contexts: Mar 19 15:17:23 crc kubenswrapper[4771]: - context: Mar 19 15:17:23 crc kubenswrapper[4771]: cluster: default-cluster Mar 19 15:17:23 crc kubenswrapper[4771]: namespace: default Mar 19 15:17:23 crc kubenswrapper[4771]: user: default-auth Mar 19 15:17:23 crc kubenswrapper[4771]: name: default-context Mar 19 15:17:23 crc kubenswrapper[4771]: current-context: default-context Mar 19 15:17:23 crc kubenswrapper[4771]: kind: Config Mar 19 15:17:23 crc kubenswrapper[4771]: preferences: {} Mar 19 15:17:23 crc kubenswrapper[4771]: users: Mar 19 15:17:23 crc kubenswrapper[4771]: - name: default-auth Mar 19 15:17:23 crc kubenswrapper[4771]: user: Mar 19 15:17:23 crc kubenswrapper[4771]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 15:17:23 crc kubenswrapper[4771]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 15:17:23 crc kubenswrapper[4771]: EOF Mar 19 15:17:23 crc kubenswrapper[4771]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hk5n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:23 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.415030 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.429220 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.429246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.429254 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.429268 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.429278 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.507824 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.508285 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.508307 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.508393 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.539116 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.539185 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.539197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.539220 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.539235 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.641957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.642451 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.642612 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.642740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.642868 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.745788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.745850 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.745873 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.745900 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.745921 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.848725 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.848762 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.848773 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.848790 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.848802 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.929458 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"69e97e6616d71e73668c2c7097c1536469c45c0d2233f02aa729ec54cc483386"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.931055 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" event={"ID":"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db","Type":"ContainerStarted","Data":"2dad7fc28838e9f36668da2e4510c16de45332a79001726ecdc0356776bcc30e"} Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.932519 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:23 crc kubenswrapper[4771]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 19 15:17:23 crc kubenswrapper[4771]: apiVersion: v1 Mar 19 15:17:23 crc kubenswrapper[4771]: clusters: Mar 19 15:17:23 crc kubenswrapper[4771]: - cluster: Mar 19 15:17:23 crc kubenswrapper[4771]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 19 15:17:23 crc kubenswrapper[4771]: server: https://api-int.crc.testing:6443 Mar 19 15:17:23 crc kubenswrapper[4771]: name: default-cluster Mar 19 15:17:23 crc kubenswrapper[4771]: contexts: Mar 19 15:17:23 crc kubenswrapper[4771]: - context: Mar 19 15:17:23 crc kubenswrapper[4771]: cluster: default-cluster Mar 19 15:17:23 crc kubenswrapper[4771]: namespace: default Mar 19 15:17:23 crc kubenswrapper[4771]: user: default-auth Mar 19 15:17:23 crc kubenswrapper[4771]: name: default-context Mar 19 15:17:23 crc kubenswrapper[4771]: current-context: default-context Mar 19 15:17:23 crc kubenswrapper[4771]: kind: Config Mar 19 15:17:23 crc 
kubenswrapper[4771]: preferences: {} Mar 19 15:17:23 crc kubenswrapper[4771]: users: Mar 19 15:17:23 crc kubenswrapper[4771]: - name: default-auth Mar 19 15:17:23 crc kubenswrapper[4771]: user: Mar 19 15:17:23 crc kubenswrapper[4771]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 15:17:23 crc kubenswrapper[4771]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 15:17:23 crc kubenswrapper[4771]: EOF Mar 19 15:17:23 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hk5n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:23 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.933758 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" 
podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.934973 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9989m" event={"ID":"51f8c2de-454d-4b7c-bf30-2f5d12d7088e","Type":"ContainerStarted","Data":"e229c5e152cba41ad9e556a55bfd4a72c201d302d24a666e571dc1645f0ca823"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.936540 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"81f0a13165907da485e3b16aa4d91a9c2ba804f351b3a48422b51ba4f7973cb3"} Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.936606 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8rt68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes
.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nmdkf_openshift-multus(7afaaec8-b9d9-4b61-8bd2-3517ef7de1db): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.938162 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" podUID="7afaaec8-b9d9-4b61-8bd2-3517ef7de1db" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.939626 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:17:23 crc kubenswrapper[4771]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 19 15:17:23 crc kubenswrapper[4771]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 19 15:17:23 crc kubenswrapper[4771]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26lvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-9989m_openshift-multus(51f8c2de-454d-4b7c-bf30-2f5d12d7088e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 15:17:23 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.940960 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-9989m" podUID="51f8c2de-454d-4b7c-bf30-2f5d12d7088e" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.941286 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wgtqq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.943731 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wgtqq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 15:17:23 crc kubenswrapper[4771]: E0319 15:17:23.945039 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.951536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.951503 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.951586 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.951719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.951749 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.951771 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:23Z","lastTransitionTime":"2026-03-19T15:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.964418 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.978004 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.986483 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:23 crc kubenswrapper[4771]: I0319 15:17:23.995931 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.011835 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.018821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.028867 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.039499 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.050051 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.053815 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.053882 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.053907 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.053939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.053960 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.065111 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.077783 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f
528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.086620 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.098238 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.110906 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.127247 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.142769 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.156212 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.157153 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.157315 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.157422 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.157517 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.157624 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.170132 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.184805 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.207484 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.228221 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.238263 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.252227 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.260280 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.260403 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.260512 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.260668 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.260733 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.363764 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.363820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.363835 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.363857 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.363875 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.466629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.467067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.467267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.467530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.467805 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.508137 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:17:24 crc kubenswrapper[4771]: E0319 15:17:24.508323 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.570796 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.570853 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.570871 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.570912 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.570929 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.673328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.673402 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.673423 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.673452 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.673475 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.776854 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.776915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.776933 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.776957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.776976 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.880202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.880260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.880276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.880303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.880321 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.983284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.983321 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.983330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.983345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:24 crc kubenswrapper[4771]: I0319 15:17:24.983354 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:24Z","lastTransitionTime":"2026-03-19T15:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.086804 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.087703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.087856 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.088055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.088201 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:25Z","lastTransitionTime":"2026-03-19T15:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.190889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.190936 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.190952 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.191012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.191030 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:25Z","lastTransitionTime":"2026-03-19T15:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.294485 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.294806 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.294942 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.295158 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.295302 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:25Z","lastTransitionTime":"2026-03-19T15:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.398050 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.398340 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.398511 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.398717 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.398885 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:25Z","lastTransitionTime":"2026-03-19T15:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.457477 4771 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.501244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.501297 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.501314 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.501338 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.501356 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:25Z","lastTransitionTime":"2026-03-19T15:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.508365 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.508416 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:17:25 crc kubenswrapper[4771]: E0319 15:17:25.508562 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:17:25 crc kubenswrapper[4771]: E0319 15:17:25.508696 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.604157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.604203 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.604213 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.604229 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.604240 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:25Z","lastTransitionTime":"2026-03-19T15:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.707147 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.707206 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.707222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.707245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.707262 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:25Z","lastTransitionTime":"2026-03-19T15:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.809945 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.810003 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.810013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.810030 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.810041 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:25Z","lastTransitionTime":"2026-03-19T15:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.912516 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.912566 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.912577 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.912594 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:25 crc kubenswrapper[4771]: I0319 15:17:25.912606 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:25Z","lastTransitionTime":"2026-03-19T15:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.015522 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.015570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.015580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.015596 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.015607 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.118129 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.118227 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.118252 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.118279 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.118298 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.220707 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.220745 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.220754 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.220788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.220801 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.323327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.323827 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.324053 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.324199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.324334 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.427190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.427524 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.427722 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.427896 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.428087 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.508623 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:17:26 crc kubenswrapper[4771]: E0319 15:17:26.509139 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.530576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.530616 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.530625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.530640 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.530651 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.632868 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.632925 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.632936 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.632950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.632958 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.735251 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.735315 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.735335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.735364 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.735387 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.838351 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.838413 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.838430 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.838455 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.838472 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.940717 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.940796 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.940829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.940859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:26 crc kubenswrapper[4771]: I0319 15:17:26.940880 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:26Z","lastTransitionTime":"2026-03-19T15:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.043305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.043370 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.043393 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.043422 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.043442 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.146660 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.146722 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.146742 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.146765 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.146783 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.249062 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.249098 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.249105 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.249119 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.249129 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.351260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.351315 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.351333 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.351356 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.351373 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.454261 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.454300 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.454314 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.454331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.454343 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.508328 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:27 crc kubenswrapper[4771]: E0319 15:17:27.508469 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.508329 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:27 crc kubenswrapper[4771]: E0319 15:17:27.508619 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.557131 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.557180 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.557190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.557205 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.557215 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.659676 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.659733 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.659751 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.659774 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.659793 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.762370 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.762434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.762456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.762480 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.762497 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.865356 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.865401 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.865418 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.865433 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.865444 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.967978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.968072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.968089 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.968112 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:27 crc kubenswrapper[4771]: I0319 15:17:27.968135 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:27Z","lastTransitionTime":"2026-03-19T15:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.071112 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.071158 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.071168 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.071183 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.071192 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:28Z","lastTransitionTime":"2026-03-19T15:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.173404 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.173443 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.173489 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.173505 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.173514 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:28Z","lastTransitionTime":"2026-03-19T15:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.275954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.276287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.276424 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.276556 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.276669 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:28Z","lastTransitionTime":"2026-03-19T15:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.380166 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.380209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.380222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.380237 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.380248 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:28Z","lastTransitionTime":"2026-03-19T15:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.483072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.483458 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.483784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.484020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.484218 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:28Z","lastTransitionTime":"2026-03-19T15:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.508223 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:28 crc kubenswrapper[4771]: E0319 15:17:28.508837 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.588730 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.589194 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.589372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.589581 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.589740 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:28Z","lastTransitionTime":"2026-03-19T15:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.693190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.693595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.693807 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.693949 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.694202 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:28Z","lastTransitionTime":"2026-03-19T15:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.797319 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.797646 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.797733 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.797821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.797883 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:28Z","lastTransitionTime":"2026-03-19T15:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.887234 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qhmqm"] Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.887679 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.889871 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.890231 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.890380 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.892361 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.900542 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.900759 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.900952 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.901210 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.901415 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:28Z","lastTransitionTime":"2026-03-19T15:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.908183 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.933285 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.944099 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.955412 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.967754 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.982076 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96ee37da-7e5a-49de-bf2b-0857fa6f36b4-host\") pod \"node-ca-qhmqm\" (UID: \"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\") " pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.982143 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96ee37da-7e5a-49de-bf2b-0857fa6f36b4-serviceca\") pod \"node-ca-qhmqm\" (UID: \"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\") " pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.982188 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcnzz\" (UniqueName: \"kubernetes.io/projected/96ee37da-7e5a-49de-bf2b-0857fa6f36b4-kube-api-access-zcnzz\") pod \"node-ca-qhmqm\" (UID: \"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\") " 
pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.982431 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:28 crc kubenswrapper[4771]: I0319 15:17:28.997046 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.004701 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.004743 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.004759 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.004781 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.004797 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.011409 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.028245 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.044101 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.053107 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.068645 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.077901 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.083470 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96ee37da-7e5a-49de-bf2b-0857fa6f36b4-serviceca\") pod \"node-ca-qhmqm\" (UID: \"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\") " pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.083551 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcnzz\" (UniqueName: \"kubernetes.io/projected/96ee37da-7e5a-49de-bf2b-0857fa6f36b4-kube-api-access-zcnzz\") pod \"node-ca-qhmqm\" (UID: \"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\") " pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.083646 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96ee37da-7e5a-49de-bf2b-0857fa6f36b4-host\") pod \"node-ca-qhmqm\" (UID: \"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\") " pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.083731 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96ee37da-7e5a-49de-bf2b-0857fa6f36b4-host\") pod \"node-ca-qhmqm\" (UID: \"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\") " pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.085497 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96ee37da-7e5a-49de-bf2b-0857fa6f36b4-serviceca\") pod \"node-ca-qhmqm\" (UID: \"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\") " pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.102219 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcnzz\" (UniqueName: \"kubernetes.io/projected/96ee37da-7e5a-49de-bf2b-0857fa6f36b4-kube-api-access-zcnzz\") pod \"node-ca-qhmqm\" (UID: \"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\") " pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.107798 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.107822 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.107832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.107847 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.107857 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.201172 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qhmqm" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.210347 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.210399 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.210421 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.210452 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.210475 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.213323 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.213348 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.213357 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.213369 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.213380 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: E0319 15:17:29.232292 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.238646 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.238703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.238720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.238743 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.238761 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: E0319 15:17:29.256426 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.261430 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.261555 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.261644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.261713 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.261778 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: E0319 15:17:29.271867 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.276236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.276306 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.276329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.276356 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.276375 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: E0319 15:17:29.287098 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.292224 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.292283 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.292308 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.292340 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.292358 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: E0319 15:17:29.312167 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: E0319 15:17:29.312313 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.314335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.314368 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.314379 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.314396 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.314407 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.417114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.417160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.417172 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.417189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.417201 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.508385 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.508493 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:29 crc kubenswrapper[4771]: E0319 15:17:29.508610 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:29 crc kubenswrapper[4771]: E0319 15:17:29.508852 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.522170 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.522209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.522217 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.522232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.522243 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.624903 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.624941 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.624949 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.624964 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.624974 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.727675 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.727751 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.727762 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.727799 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.727811 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.830242 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.830503 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.830581 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.830656 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.830718 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.932743 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.933079 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.933232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.933372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.933543 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:29Z","lastTransitionTime":"2026-03-19T15:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.954364 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qhmqm" event={"ID":"96ee37da-7e5a-49de-bf2b-0857fa6f36b4","Type":"ContainerStarted","Data":"5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.954553 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qhmqm" event={"ID":"96ee37da-7e5a-49de-bf2b-0857fa6f36b4","Type":"ContainerStarted","Data":"e6080ff9a7cee253c475dac315202f6d6be5f384ba7a6c65536596ca61c964c4"} Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.964585 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.972823 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:29 crc kubenswrapper[4771]: I0319 15:17:29.981225 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.005139 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.012792 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.026321 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.035361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.035399 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.035408 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.035422 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.035436 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.037353 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.050031 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.060680 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.077052 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.086555 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.096277 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.108887 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.138444 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.138510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.138528 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.138551 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.138568 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.241608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.241684 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.241711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.241740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.241763 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.344663 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.344706 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.344719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.344733 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.344743 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.447739 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.447797 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.447816 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.447840 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.447857 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.508059 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:30 crc kubenswrapper[4771]: E0319 15:17:30.508856 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.509245 4771 scope.go:117] "RemoveContainer" containerID="efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.550255 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.550312 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.550329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.550352 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.550369 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.652954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.653015 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.653028 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.653045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.653057 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.755641 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.755685 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.755695 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.755711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.755722 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.858645 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.858685 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.858696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.858711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.858721 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.959414 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca"}
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.959475 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147"}
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.960397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.960443 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.960460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.960482 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.960499 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:30Z","lastTransitionTime":"2026-03-19T15:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.962794 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.965123 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913"}
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.965711 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.980484 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:30 crc kubenswrapper[4771]: I0319 15:17:30.996776 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.010878 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.024614 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.039626 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.052409 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.063409 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.063456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.063473 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.063496 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.063514 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.068030 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.081190 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.088951 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.101536 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.116632 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.134687 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.142646 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.150718 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.159563 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.165388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.165432 4771 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.165443 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.165460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.165472 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.170957 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.186951 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.193870 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.202973 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.213401 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.223592 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.236223 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.251351 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.268587 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.268617 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.268626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.268639 4771 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.268648 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.268654 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.282361 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.294057 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.304245 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.304344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.304381 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.304474 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.304511 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.304514 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:17:47.304483556 +0000 UTC m=+126.533104778 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.304639 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:47.304605459 +0000 UTC m=+126.533226691 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.304677 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:47.30466206 +0000 UTC m=+126.533283372 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.370814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.370883 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.370904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.370930 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.370950 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.405646 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.405711 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.405810 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.405843 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.405856 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.405894 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:31 crc 
kubenswrapper[4771]: E0319 15:17:31.405921 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.405941 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.405923 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:47.40590364 +0000 UTC m=+126.634524952 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.406029 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:47.406010412 +0000 UTC m=+126.634631674 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.473770 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.473826 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.473845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.473869 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.473885 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.508334 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.508370 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.508938 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:31 crc kubenswrapper[4771]: E0319 15:17:31.509060 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.527296 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.542877 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.570683 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.576100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.576132 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.576143 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.576160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.576171 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.583958 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.595636 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.612055 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.628463 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.647382 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.661812 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.676780 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.678264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.678293 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.678301 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.678316 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.678325 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.688472 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.701784 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.721846 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.780035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.780071 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.780083 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.780098 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.780108 4771 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.883617 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.883679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.883691 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.883710 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.883723 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.970498 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f"} Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.984101 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.986424 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.986465 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.986481 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.986503 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:31 crc kubenswrapper[4771]: I0319 15:17:31.986522 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:31Z","lastTransitionTime":"2026-03-19T15:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.002296 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.021867 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.047274 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.060981 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.080880 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.089943 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.090013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.090034 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:32 crc 
kubenswrapper[4771]: I0319 15:17:32.090059 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.090085 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:32Z","lastTransitionTime":"2026-03-19T15:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.096474 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.111341 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.126781 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.142162 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.158517 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.176655 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.193664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.193719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.193735 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.193759 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.193775 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:32Z","lastTransitionTime":"2026-03-19T15:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.194274 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.297108 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.297195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.297217 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.297242 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.297259 4771 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:32Z","lastTransitionTime":"2026-03-19T15:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.400304 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.400598 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.400686 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.400776 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.400895 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:32Z","lastTransitionTime":"2026-03-19T15:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.504633 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.504692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.504710 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.504736 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.504760 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:32Z","lastTransitionTime":"2026-03-19T15:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.508630 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:32 crc kubenswrapper[4771]: E0319 15:17:32.508848 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.607740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.607786 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.607796 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.607810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.607821 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:32Z","lastTransitionTime":"2026-03-19T15:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.710626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.710701 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.710719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.710747 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.710764 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:32Z","lastTransitionTime":"2026-03-19T15:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.813185 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.813243 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.813259 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.813282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.813300 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:32Z","lastTransitionTime":"2026-03-19T15:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.915833 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.915881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.915889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.915904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.915916 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:32Z","lastTransitionTime":"2026-03-19T15:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.974878 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03"} Mar 19 15:17:32 crc kubenswrapper[4771]: I0319 15:17:32.993880 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:32Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.010207 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.018114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.018164 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.018180 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc 
kubenswrapper[4771]: I0319 15:17:33.018195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.018206 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.027329 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.043275 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.058207 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.080319 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.100229 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.116584 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.119852 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.119893 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.119907 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.119926 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.119937 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.140745 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.154482 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.167740 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.182880 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.195080 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:33Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.223160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 
15:17:33.223214 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.223228 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.223247 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.223257 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.325789 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.325845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.325862 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.325887 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.325903 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.428872 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.428927 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.428945 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.428971 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.429015 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.507777 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.507936 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:33 crc kubenswrapper[4771]: E0319 15:17:33.508158 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:33 crc kubenswrapper[4771]: E0319 15:17:33.508338 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.531139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.531175 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.531184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.531199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.531208 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.634181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.634224 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.634236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.634252 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.634263 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.737135 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.737486 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.737757 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.737851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.737914 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.840759 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.840803 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.840813 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.840827 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.840838 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.943870 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.944231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.944373 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.944528 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:33 crc kubenswrapper[4771]: I0319 15:17:33.944928 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:33Z","lastTransitionTime":"2026-03-19T15:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.047907 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.047932 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.047940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.047953 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.047961 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.149884 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.149947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.149965 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.150031 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.150056 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.252771 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.252828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.252847 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.252868 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.252884 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.355671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.355708 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.355717 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.355732 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.355742 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.457702 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.457740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.457748 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.457762 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.457771 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.508436 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:34 crc kubenswrapper[4771]: E0319 15:17:34.508617 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.560503 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.560563 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.560584 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.560607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.560624 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.663924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.664331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.664577 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.664737 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.665065 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.756660 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp"] Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.757072 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.759898 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.764145 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.768134 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.768190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.768203 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.768245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.768258 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.784138 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.795917 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.806979 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.823252 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.835363 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.847078 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.849623 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4k29\" (UniqueName: \"kubernetes.io/projected/52bde5c1-4714-4fff-bab9-3bbc84a71782-kube-api-access-v4k29\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.849689 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52bde5c1-4714-4fff-bab9-3bbc84a71782-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc 
kubenswrapper[4771]: I0319 15:17:34.849713 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52bde5c1-4714-4fff-bab9-3bbc84a71782-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.849757 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52bde5c1-4714-4fff-bab9-3bbc84a71782-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.859368 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.871815 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.871947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.871999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.872039 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc 
kubenswrapper[4771]: I0319 15:17:34.872056 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.872067 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.883437 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.896316 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.907387 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.921467 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.934665 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.944732 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:34Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.951150 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4k29\" (UniqueName: \"kubernetes.io/projected/52bde5c1-4714-4fff-bab9-3bbc84a71782-kube-api-access-v4k29\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.951210 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52bde5c1-4714-4fff-bab9-3bbc84a71782-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.951230 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52bde5c1-4714-4fff-bab9-3bbc84a71782-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.951257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52bde5c1-4714-4fff-bab9-3bbc84a71782-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.951811 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/52bde5c1-4714-4fff-bab9-3bbc84a71782-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.952362 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52bde5c1-4714-4fff-bab9-3bbc84a71782-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.958365 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52bde5c1-4714-4fff-bab9-3bbc84a71782-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.968036 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4k29\" (UniqueName: \"kubernetes.io/projected/52bde5c1-4714-4fff-bab9-3bbc84a71782-kube-api-access-v4k29\") pod \"ovnkube-control-plane-749d76644c-rgdpp\" (UID: \"52bde5c1-4714-4fff-bab9-3bbc84a71782\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.973973 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.974014 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.974023 4771 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.974035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:34 crc kubenswrapper[4771]: I0319 15:17:34.974042 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:34Z","lastTransitionTime":"2026-03-19T15:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.076568 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.076607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.076617 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.076632 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.076642 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.086149 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" Mar 19 15:17:35 crc kubenswrapper[4771]: W0319 15:17:35.097523 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52bde5c1_4714_4fff_bab9_3bbc84a71782.slice/crio-f900fe9bc841a98a12b0251fe059ef2225189187ed98821811818db4436bb93f WatchSource:0}: Error finding container f900fe9bc841a98a12b0251fe059ef2225189187ed98821811818db4436bb93f: Status 404 returned error can't find the container with id f900fe9bc841a98a12b0251fe059ef2225189187ed98821811818db4436bb93f Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.180090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.180123 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.180132 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.180145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.180155 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.282238 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.282275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.282285 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.282304 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.282317 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.383600 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.383638 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.383650 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.383666 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.383679 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.485407 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.485468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.485485 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.485511 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.485527 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.508350 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.508398 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:35 crc kubenswrapper[4771]: E0319 15:17:35.508512 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:35 crc kubenswrapper[4771]: E0319 15:17:35.509063 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.517129 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zjhnk"] Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.517679 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:35 crc kubenswrapper[4771]: E0319 15:17:35.517760 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.541359 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc 
kubenswrapper[4771]: I0319 15:17:35.562516 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.583585 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.589947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.590001 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.590010 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.590024 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.590034 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.596821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.608950 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.619235 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.633414 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.649404 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.658492 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r5dzj\" (UniqueName: \"kubernetes.io/projected/7fb3bb21-b72b-45e1-9b87-73f281abba90-kube-api-access-r5dzj\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.658902 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.662738 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.683766 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.692757 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.692792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.692801 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.692815 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.692826 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.697005 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.717267 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.733688 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.745892 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.759382 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:35Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.759874 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5dzj\" (UniqueName: \"kubernetes.io/projected/7fb3bb21-b72b-45e1-9b87-73f281abba90-kube-api-access-r5dzj\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.759937 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:35 crc kubenswrapper[4771]: E0319 15:17:35.760322 4771 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:35 crc kubenswrapper[4771]: E0319 15:17:35.760489 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs podName:7fb3bb21-b72b-45e1-9b87-73f281abba90 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:36.260465112 +0000 UTC m=+115.489086334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs") pod "network-metrics-daemon-zjhnk" (UID: "7fb3bb21-b72b-45e1-9b87-73f281abba90") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.776999 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5dzj\" (UniqueName: \"kubernetes.io/projected/7fb3bb21-b72b-45e1-9b87-73f281abba90-kube-api-access-r5dzj\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.795456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.795506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.795521 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.795538 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.795550 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.897700 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.897737 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.897745 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.897761 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.897773 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.983860 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" event={"ID":"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db","Type":"ContainerStarted","Data":"0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.986654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" event={"ID":"52bde5c1-4714-4fff-bab9-3bbc84a71782","Type":"ContainerStarted","Data":"fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.986695 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" event={"ID":"52bde5c1-4714-4fff-bab9-3bbc84a71782","Type":"ContainerStarted","Data":"c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.986707 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" event={"ID":"52bde5c1-4714-4fff-bab9-3bbc84a71782","Type":"ContainerStarted","Data":"f900fe9bc841a98a12b0251fe059ef2225189187ed98821811818db4436bb93f"} Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.999874 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.999916 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.999927 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.999942 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:35 crc kubenswrapper[4771]: I0319 15:17:35.999952 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:35Z","lastTransitionTime":"2026-03-19T15:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.005437 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.027103 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.041364 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.054864 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.067954 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.082103 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.097212 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.101523 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.101545 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.101553 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.101565 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.101575 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:36Z","lastTransitionTime":"2026-03-19T15:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.112668 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.128674 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ent
rypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.142426 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.157737 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.170959 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.185266 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc 
kubenswrapper[4771]: I0319 15:17:36.205213 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.205260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.205270 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.205290 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.205301 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:36Z","lastTransitionTime":"2026-03-19T15:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.205906 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.224887 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.242559 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.265640 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:36 crc kubenswrapper[4771]: 
E0319 15:17:36.265817 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:36 crc kubenswrapper[4771]: E0319 15:17:36.265885 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs podName:7fb3bb21-b72b-45e1-9b87-73f281abba90 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:37.265865933 +0000 UTC m=+116.494487135 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs") pod "network-metrics-daemon-zjhnk" (UID: "7fb3bb21-b72b-45e1-9b87-73f281abba90") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.267220 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.283859 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.299770 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.307670 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.307699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:36 crc 
kubenswrapper[4771]: I0319 15:17:36.307711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.307728 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.307743 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:36Z","lastTransitionTime":"2026-03-19T15:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.316611 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.327844 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc 
kubenswrapper[4771]: I0319 15:17:36.347505 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.359549 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.372589 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.384133 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.397727 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.406747 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.410027 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.410073 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.410093 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.410115 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.410132 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:36Z","lastTransitionTime":"2026-03-19T15:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.420214 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.433902 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.457505 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.508711 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:36 crc kubenswrapper[4771]: E0319 15:17:36.509260 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.512652 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.512709 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.512728 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.512752 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.512769 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:36Z","lastTransitionTime":"2026-03-19T15:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.615250 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.615313 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.615334 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.615358 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.615375 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:36Z","lastTransitionTime":"2026-03-19T15:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.719857 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.719885 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.719897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.719913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.719923 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:36Z","lastTransitionTime":"2026-03-19T15:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.822841 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.822895 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.822909 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.822929 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.822977 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:36Z","lastTransitionTime":"2026-03-19T15:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.926195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.926248 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.926286 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.926315 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.926336 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:36Z","lastTransitionTime":"2026-03-19T15:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.991727 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9989m" event={"ID":"51f8c2de-454d-4b7c-bf30-2f5d12d7088e","Type":"ContainerStarted","Data":"b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.994841 4771 generic.go:334] "Generic (PLEG): container finished" podID="7afaaec8-b9d9-4b61-8bd2-3517ef7de1db" containerID="0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb" exitCode=0 Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.994969 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" event={"ID":"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db","Type":"ContainerDied","Data":"0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb"} Mar 19 15:17:36 crc kubenswrapper[4771]: I0319 15:17:36.997524 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hg7b2" event={"ID":"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc","Type":"ContainerStarted","Data":"c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.006492 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.022964 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.030281 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.030398 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.030423 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.030453 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.030477 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.037631 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.066642 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.081751 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.100824 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.115734 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.131343 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.132374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.132411 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc 
kubenswrapper[4771]: I0319 15:17:37.132427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.132447 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.132461 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.144853 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.160671 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.176827 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.193025 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.209327 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.224032 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.234952 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.235031 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.235047 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.235065 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.235079 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.238148 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc 
kubenswrapper[4771]: I0319 15:17:37.251200 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.264398 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.277502 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.279163 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:37 crc kubenswrapper[4771]: E0319 15:17:37.279354 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:37 crc kubenswrapper[4771]: E0319 15:17:37.279449 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs podName:7fb3bb21-b72b-45e1-9b87-73f281abba90 nodeName:}" failed. 
No retries permitted until 2026-03-19 15:17:39.27942515 +0000 UTC m=+118.508046382 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs") pod "network-metrics-daemon-zjhnk" (UID: "7fb3bb21-b72b-45e1-9b87-73f281abba90") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.289694 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.301632 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.312823 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc 
kubenswrapper[4771]: I0319 15:17:37.325642 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.337745 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.337795 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.337808 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.337826 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.337838 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.341925 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.353845 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.365552 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.375716 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.386385 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb2767
03f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.397335 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.410949 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.440375 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 
15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.440413 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.440423 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.440435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.440445 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.441079 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.508476 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.508506 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.508533 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:37 crc kubenswrapper[4771]: E0319 15:17:37.508608 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:37 crc kubenswrapper[4771]: E0319 15:17:37.508977 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:37 crc kubenswrapper[4771]: E0319 15:17:37.509063 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.542291 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.542372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.542396 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.542427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.542450 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.644882 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.645107 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.645265 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.645385 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.645509 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.749113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.749468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.749479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.749495 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.749507 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.852200 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.852240 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.852250 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.852266 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.852276 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.954911 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.954946 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.954955 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.954971 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:37 crc kubenswrapper[4771]: I0319 15:17:37.954980 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:37Z","lastTransitionTime":"2026-03-19T15:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.001028 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd" exitCode=0 Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.001099 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.003308 4771 generic.go:334] "Generic (PLEG): container finished" podID="7afaaec8-b9d9-4b61-8bd2-3517ef7de1db" containerID="5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469" exitCode=0 Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.003360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" event={"ID":"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db","Type":"ContainerDied","Data":"5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.022355 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.041370 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.054488 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.057402 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.057467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.057478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 
15:17:38.057492 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.057502 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.066858 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.083867 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.095815 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.107208 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.117804 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.142262 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.158643 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.159956 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.160025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.160041 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.160063 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.160079 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.171385 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.183286 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pro
xy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.194218 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc 
kubenswrapper[4771]: I0319 15:17:38.206185 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.218772 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.235560 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.260061 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.267682 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.267739 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.267755 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.267778 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.267821 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.274282 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.282334 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f
6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.290205 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.304077 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.316710 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.331277 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.346667 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.360854 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.369767 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.369804 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.369815 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.369831 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.369843 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.373915 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.384081 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.395361 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc 
kubenswrapper[4771]: I0319 15:17:38.409235 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.428322 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.471560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.471585 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.471593 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.471607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.471616 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.508234 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:38 crc kubenswrapper[4771]: E0319 15:17:38.508460 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.574246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.574291 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.574308 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.574331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.574350 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.677080 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.677114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.677122 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.677138 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.677148 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.780317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.780382 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.780402 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.780426 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.780444 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.882599 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.882649 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.882689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.882710 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.882724 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.986204 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.986245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.986258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.986277 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:38 crc kubenswrapper[4771]: I0319 15:17:38.986305 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:38Z","lastTransitionTime":"2026-03-19T15:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.011398 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.011462 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.011491 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.011516 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.011533 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.011553 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} Mar 19 15:17:39 crc kubenswrapper[4771]: 
I0319 15:17:39.014823 4771 generic.go:334] "Generic (PLEG): container finished" podID="7afaaec8-b9d9-4b61-8bd2-3517ef7de1db" containerID="cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435" exitCode=0 Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.014907 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" event={"ID":"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db","Type":"ContainerDied","Data":"cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.019835 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.019904 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.029637 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.053419 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.075352 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T1
5:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.091075 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.091135 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.091154 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.091180 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.091198 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.097137 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.111298 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.131443 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.149473 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.163114 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.179458 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.192796 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 
15:17:39.194530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.194576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.194594 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.194618 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.194635 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.207246 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.221827 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.234849 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.247422 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.262134 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc 
kubenswrapper[4771]: I0319 15:17:39.279015 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.295507 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.296625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.296660 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.296671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.296687 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.296699 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.302396 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.302600 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.302669 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs podName:7fb3bb21-b72b-45e1-9b87-73f281abba90 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:43.30264983 +0000 UTC m=+122.531271042 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs") pod "network-metrics-daemon-zjhnk" (UID: "7fb3bb21-b72b-45e1-9b87-73f281abba90") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.307533 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.317503 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42e
a83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.327480 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc 
kubenswrapper[4771]: I0319 15:17:39.337074 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.347086 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b
3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.360140 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.381388 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.391983 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.399287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.399330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.399342 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 
15:17:39.399361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.399376 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.403755 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.415271 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.426849 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.440996 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.454865 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 
15:17:39.502371 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.502405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.502414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.502427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.502436 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.508136 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.508179 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.508228 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.508324 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.508403 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.508572 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.575070 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.575112 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.575122 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.575137 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.575148 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.597074 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.600841 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.600903 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.600920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.600946 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.600962 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.618572 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.622769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.622828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.622847 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.622872 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.622890 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.653850 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.662425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.662489 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.662510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.662535 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.662552 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.688128 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.691912 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.691940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.691948 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.691960 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.691968 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.705452 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:39Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:39 crc kubenswrapper[4771]: E0319 15:17:39.705577 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.707309 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.707429 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.707524 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.707621 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.707707 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.810922 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.811282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.811638 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.811836 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.812037 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.916264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.916576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.916733 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.916920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:39 crc kubenswrapper[4771]: I0319 15:17:39.917091 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:39Z","lastTransitionTime":"2026-03-19T15:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.024122 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.024350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.024443 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.024514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.024584 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.026403 4771 generic.go:334] "Generic (PLEG): container finished" podID="7afaaec8-b9d9-4b61-8bd2-3517ef7de1db" containerID="c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa" exitCode=0 Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.026445 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" event={"ID":"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db","Type":"ContainerDied","Data":"c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.041604 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.053279 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.064446 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.074204 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc 
kubenswrapper[4771]: I0319 15:17:40.086808 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.097381 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 
15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.108966 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 
15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.126154 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.128114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.128140 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.128148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.128161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.128169 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.135175 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.146256 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f
6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.158064 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.172942 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.186104 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.206075 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.220396 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:40Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.230934 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.230968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.230978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.231008 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.231018 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.333052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.333079 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.333087 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.333099 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.333107 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.435435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.435497 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.435519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.435548 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.435569 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.508314 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:40 crc kubenswrapper[4771]: E0319 15:17:40.508535 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.539257 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.539294 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.539303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.539317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.539327 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.642138 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.642161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.642169 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.642181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.642190 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.743940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.744026 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.744050 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.744076 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.744093 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.846744 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.846802 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.846822 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.846852 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.846873 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.949670 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.949711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.949722 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.949736 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:40 crc kubenswrapper[4771]: I0319 15:17:40.949745 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:40Z","lastTransitionTime":"2026-03-19T15:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.034879 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.039605 4771 generic.go:334] "Generic (PLEG): container finished" podID="7afaaec8-b9d9-4b61-8bd2-3517ef7de1db" containerID="bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8" exitCode=0 Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.039649 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" event={"ID":"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db","Type":"ContainerDied","Data":"bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8"} Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.054543 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f7087
2e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.065893 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.065939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.065952 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:41 crc 
kubenswrapper[4771]: I0319 15:17:41.065971 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.065983 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:41Z","lastTransitionTime":"2026-03-19T15:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.079345 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.103879 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.116692 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a
20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.128292 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.143706 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.158034 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.168661 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.168715 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.168735 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.168756 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.168770 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:41Z","lastTransitionTime":"2026-03-19T15:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.174305 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.190272 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.205114 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.225039 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.242225 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.259501 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.271278 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.271339 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.271363 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.271391 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.271413 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:41Z","lastTransitionTime":"2026-03-19T15:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.273768 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc 
kubenswrapper[4771]: I0319 15:17:41.291057 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.373524 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.373889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.373915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.373943 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.373964 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:41Z","lastTransitionTime":"2026-03-19T15:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:17:41 crc kubenswrapper[4771]: E0319 15:17:41.474466 4771 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.509217 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:41 crc kubenswrapper[4771]: E0319 15:17:41.509313 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.509804 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:41 crc kubenswrapper[4771]: E0319 15:17:41.509887 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.509959 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:41 crc kubenswrapper[4771]: E0319 15:17:41.510041 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.525155 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.545309 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.559926 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.571043 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.588915 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.616729 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: E0319 15:17:41.619270 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.628368 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.639809 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.652061 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.662314 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc 
kubenswrapper[4771]: I0319 15:17:41.672389 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.684118 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b
3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.698152 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.724691 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:41 crc kubenswrapper[4771]: I0319 15:17:41.735711 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.047843 4771 generic.go:334] "Generic (PLEG): container finished" podID="7afaaec8-b9d9-4b61-8bd2-3517ef7de1db" containerID="e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c" exitCode=0 Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.047884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" 
event={"ID":"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db","Type":"ContainerDied","Data":"e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c"} Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.064396 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.089166 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.112619 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.133168 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.160366 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.178496 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.194658 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.208977 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.223525 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.236341 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc 
kubenswrapper[4771]: I0319 15:17:42.246436 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.257341 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b
3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.271332 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.290294 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.301673 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:42Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:42 crc kubenswrapper[4771]: I0319 15:17:42.508266 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:42 crc kubenswrapper[4771]: E0319 15:17:42.508445 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.057337 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" event={"ID":"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db","Type":"ContainerStarted","Data":"0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98"} Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.077385 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.095073 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.111394 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.126398 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.144768 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.156702 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc 
kubenswrapper[4771]: I0319 15:17:43.174160 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.190605 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.210975 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.224816 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.240074 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.254622 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb2767
03f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.269626 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f7087
2e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.302087 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.325945 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:43Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.345477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:43 crc kubenswrapper[4771]: E0319 15:17:43.345803 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:43 crc kubenswrapper[4771]: E0319 15:17:43.346358 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs podName:7fb3bb21-b72b-45e1-9b87-73f281abba90 nodeName:}" failed. No retries permitted until 2026-03-19 15:17:51.346335385 +0000 UTC m=+130.574956587 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs") pod "network-metrics-daemon-zjhnk" (UID: "7fb3bb21-b72b-45e1-9b87-73f281abba90") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.508262 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.508335 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:43 crc kubenswrapper[4771]: I0319 15:17:43.508268 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:43 crc kubenswrapper[4771]: E0319 15:17:43.508433 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:43 crc kubenswrapper[4771]: E0319 15:17:43.508503 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:43 crc kubenswrapper[4771]: E0319 15:17:43.508555 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.066421 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9"} Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.066961 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.067000 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.067011 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.085337 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc0
61569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.095320 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.101031 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc 
kubenswrapper[4771]: I0319 15:17:44.101631 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.115716 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.132538 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.145067 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.164282 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.177764 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.192572 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.207532 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.225581 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.242758 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.256200 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.272657 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.286324 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.299179 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.315957 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc0
61569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.327642 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc 
kubenswrapper[4771]: I0319 15:17:44.343063 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.358222 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.369846 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.388396 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.398208 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.407594 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.417408 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.427686 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.443566 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.460441 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.479667 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.497249 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.508042 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:44 crc kubenswrapper[4771]: E0319 15:17:44.508170 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:44 crc kubenswrapper[4771]: I0319 15:17:44.509386 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:44Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:45 crc kubenswrapper[4771]: I0319 15:17:45.508283 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:45 crc kubenswrapper[4771]: I0319 15:17:45.508450 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:45 crc kubenswrapper[4771]: I0319 15:17:45.508511 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:45 crc kubenswrapper[4771]: E0319 15:17:45.508765 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:17:45 crc kubenswrapper[4771]: E0319 15:17:45.508859 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:45 crc kubenswrapper[4771]: E0319 15:17:45.508903 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.074628 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/0.log" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.077712 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9" exitCode=1 Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.077768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9"} Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.078708 4771 scope.go:117] "RemoveContainer" 
containerID="28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.094927 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.107777 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.122935 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.142314 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.160516 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.177560 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.193033 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc0
61569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.205738 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc 
kubenswrapper[4771]: I0319 15:17:46.224741 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.242413 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.257395 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.275726 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:17:45Z\\\",\\\"message\\\":\\\"6 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 15:17:45.666466 6686 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0319 15:17:45.667065 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 15:17:45.667134 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 15:17:45.667184 6686 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 15:17:45.667181 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 15:17:45.667209 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 15:17:45.667218 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0319 15:17:45.667251 6686 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:17:45.667278 6686 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0319 15:17:45.667289 6686 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:17:45.667305 6686 factory.go:656] Stopping watch factory\\\\nI0319 15:17:45.667317 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:17:45.667337 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 15:17:45.667373 6686 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f
567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.286210 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.298286 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.310447 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f7087
2e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.339247 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.365016 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:17:45Z\\\",\\\"message\\\":\\\"6 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 15:17:45.666466 6686 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0319 15:17:45.667065 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 15:17:45.667134 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 15:17:45.667184 6686 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 15:17:45.667181 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 15:17:45.667209 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 15:17:45.667218 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0319 15:17:45.667251 6686 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:17:45.667278 6686 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0319 15:17:45.667289 6686 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:17:45.667305 6686 factory.go:656] Stopping watch factory\\\\nI0319 15:17:45.667317 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:17:45.667337 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 15:17:45.667373 6686 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f
567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.375167 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.384294 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.395366 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f7087
2e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.408541 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.420893 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.433712 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.450135 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.465450 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.483333 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.496940 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc0
61569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.508591 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:46 crc kubenswrapper[4771]: E0319 15:17:46.508693 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.513554 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc 
kubenswrapper[4771]: I0319 15:17:46.531702 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.545620 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: I0319 15:17:46.557112 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:46 crc kubenswrapper[4771]: E0319 15:17:46.620319 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.083805 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/1.log" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.084603 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/0.log" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.088161 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034" exitCode=1 Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.088216 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034"} Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.088262 4771 scope.go:117] "RemoveContainer" containerID="28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.089763 4771 scope.go:117] "RemoveContainer" containerID="74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034" Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.090173 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.123700 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28c9e671b26af917eeaf5cad0f71c120dfaceeef13fb7656d9da78a4425ab6c9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:17:45Z\\\",\\\"message\\\":\\\"6 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 15:17:45.666466 6686 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 15:17:45.667065 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 
15:17:45.667134 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 15:17:45.667184 6686 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 15:17:45.667181 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 15:17:45.667209 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 15:17:45.667218 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0319 15:17:45.667251 6686 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:17:45.667278 6686 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0319 15:17:45.667289 6686 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:17:45.667305 6686 factory.go:656] Stopping watch factory\\\\nI0319 15:17:45.667317 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:17:45.667337 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 15:17:45.667373 6686 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:17:47Z\\\",\\\"message\\\":\\\"iled to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z]\\\\nI0319 15:17:47.001050 6818 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001067 6818 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001022 6818 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_respon\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.139443 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.155315 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.171307 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f7087
2e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.190648 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.209862 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.231078 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.248896 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.270230 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.291287 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.306273 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.306435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.306514 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:18:19.306483554 +0000 UTC m=+158.535104786 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.306525 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.306593 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:18:19.306576287 +0000 UTC m=+158.535197519 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.306670 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.306851 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.306973 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:18:19.306946316 +0000 UTC m=+158.535567548 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.307545 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.323705 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc 
kubenswrapper[4771]: I0319 15:17:47.343367 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.362216 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.382063 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:47Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.407858 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.407912 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.408097 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.408137 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.408146 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.408164 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.408172 4771 projected.go:194] Error 
preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.408183 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.408287 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:18:19.408263607 +0000 UTC m=+158.636884850 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.408317 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:18:19.408305429 +0000 UTC m=+158.636926661 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.507944 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.507978 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.508145 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.508291 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:47 crc kubenswrapper[4771]: I0319 15:17:47.508366 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:47 crc kubenswrapper[4771]: E0319 15:17:47.508554 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.095410 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/1.log" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.100032 4771 scope.go:117] "RemoveContainer" containerID="74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034" Mar 19 15:17:48 crc kubenswrapper[4771]: E0319 15:17:48.100203 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.120000 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f7087
2e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.141903 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.173044 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:17:47Z\\\",\\\"message\\\":\\\"iled to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z]\\\\nI0319 15:17:47.001050 6818 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001067 6818 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001022 6818 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_respon\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.185438 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.198165 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.219940 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.242936 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.259785 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.284518 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.332334 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.365373 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.378100 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.390523 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.401335 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc 
kubenswrapper[4771]: I0319 15:17:48.414693 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:48Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:48 crc kubenswrapper[4771]: I0319 15:17:48.508199 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:48 crc kubenswrapper[4771]: E0319 15:17:48.508664 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.508193 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:49 crc kubenswrapper[4771]: E0319 15:17:49.508386 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.509210 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.509306 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:49 crc kubenswrapper[4771]: E0319 15:17:49.509476 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:49 crc kubenswrapper[4771]: E0319 15:17:49.509614 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.797130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.797175 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.797186 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.797205 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.797217 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:49Z","lastTransitionTime":"2026-03-19T15:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:49 crc kubenswrapper[4771]: E0319 15:17:49.812229 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:49Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.817291 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.817334 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.817342 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.817356 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.817365 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:49Z","lastTransitionTime":"2026-03-19T15:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:49 crc kubenswrapper[4771]: E0319 15:17:49.833605 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:49Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.837574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.837639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.837659 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.837685 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.837702 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:49Z","lastTransitionTime":"2026-03-19T15:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:49 crc kubenswrapper[4771]: E0319 15:17:49.853673 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:49Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.858711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.858978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.859205 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.859385 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.859525 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:49Z","lastTransitionTime":"2026-03-19T15:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:49 crc kubenswrapper[4771]: E0319 15:17:49.876743 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:49Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.881873 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.881924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.881941 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.881961 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:17:49 crc kubenswrapper[4771]: I0319 15:17:49.881976 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:17:49Z","lastTransitionTime":"2026-03-19T15:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:17:49 crc kubenswrapper[4771]: E0319 15:17:49.897927 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:49Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:49 crc kubenswrapper[4771]: E0319 15:17:49.898209 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:17:50 crc kubenswrapper[4771]: I0319 15:17:50.508514 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:50 crc kubenswrapper[4771]: E0319 15:17:50.509287 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.347622 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:51 crc kubenswrapper[4771]: E0319 15:17:51.347820 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:51 crc kubenswrapper[4771]: E0319 15:17:51.347888 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs podName:7fb3bb21-b72b-45e1-9b87-73f281abba90 nodeName:}" failed. No retries permitted until 2026-03-19 15:18:07.347866098 +0000 UTC m=+146.576487340 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs") pod "network-metrics-daemon-zjhnk" (UID: "7fb3bb21-b72b-45e1-9b87-73f281abba90") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.507653 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.507732 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:17:51 crc kubenswrapper[4771]: E0319 15:17:51.507874 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.507962 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:17:51 crc kubenswrapper[4771]: E0319 15:17:51.508224 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:17:51 crc kubenswrapper[4771]: E0319 15:17:51.508263 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.517421 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.520892 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.534937 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.548229 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.564795 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.580185 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.594706 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.608067 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.620295 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: E0319 15:17:51.620838 4771 kubelet.go:2916] "Container runtime network not ready" 
networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.632633 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc 
kubenswrapper[4771]: I0319 15:17:51.645652 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.655866 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 
15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.667780 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 
15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.697834 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:17:47Z\\\",\\\"message\\\":\\\"iled to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z]\\\\nI0319 15:17:47.001050 6818 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001067 6818 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001022 6818 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_respon\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.707052 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:51 crc kubenswrapper[4771]: I0319 15:17:51.720778 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:17:52 crc kubenswrapper[4771]: I0319 15:17:52.507841 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:17:52 crc kubenswrapper[4771]: E0319 15:17:52.507955 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:17:53 crc kubenswrapper[4771]: I0319 15:17:53.388792 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4"
Mar 19 15:17:53 crc kubenswrapper[4771]: I0319 15:17:53.391109 4771 scope.go:117] "RemoveContainer" containerID="74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034"
Mar 19 15:17:53 crc kubenswrapper[4771]: E0319 15:17:53.391481 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa"
Mar 19 15:17:53 crc kubenswrapper[4771]: I0319 15:17:53.508155 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:17:53 crc kubenswrapper[4771]: I0319 15:17:53.508180 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:17:53 crc kubenswrapper[4771]: I0319 15:17:53.508180 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:17:53 crc kubenswrapper[4771]: E0319 15:17:53.508462 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:17:53 crc kubenswrapper[4771]: E0319 15:17:53.508299 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:17:53 crc kubenswrapper[4771]: E0319 15:17:53.508503 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:17:54 crc kubenswrapper[4771]: I0319 15:17:54.508051 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:17:54 crc kubenswrapper[4771]: E0319 15:17:54.508174 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:17:55 crc kubenswrapper[4771]: I0319 15:17:55.508126 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:17:55 crc kubenswrapper[4771]: I0319 15:17:55.508159 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:17:55 crc kubenswrapper[4771]: E0319 15:17:55.508266 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:17:55 crc kubenswrapper[4771]: E0319 15:17:55.508427 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:17:55 crc kubenswrapper[4771]: I0319 15:17:55.508611 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:17:55 crc kubenswrapper[4771]: E0319 15:17:55.508888 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:17:56 crc kubenswrapper[4771]: I0319 15:17:56.508337 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:17:56 crc kubenswrapper[4771]: E0319 15:17:56.508522 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:17:56 crc kubenswrapper[4771]: E0319 15:17:56.622739 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 15:17:57 crc kubenswrapper[4771]: I0319 15:17:57.508173 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:17:57 crc kubenswrapper[4771]: I0319 15:17:57.508190 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:17:57 crc kubenswrapper[4771]: I0319 15:17:57.508235 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:17:57 crc kubenswrapper[4771]: E0319 15:17:57.508300 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:17:57 crc kubenswrapper[4771]: E0319 15:17:57.508368 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:17:57 crc kubenswrapper[4771]: E0319 15:17:57.508432 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:17:58 crc kubenswrapper[4771]: I0319 15:17:58.508537 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:17:58 crc kubenswrapper[4771]: E0319 15:17:58.508751 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:17:59 crc kubenswrapper[4771]: I0319 15:17:59.508449 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:17:59 crc kubenswrapper[4771]: I0319 15:17:59.508505 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:17:59 crc kubenswrapper[4771]: I0319 15:17:59.508550 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:17:59 crc kubenswrapper[4771]: E0319 15:17:59.508629 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:17:59 crc kubenswrapper[4771]: E0319 15:17:59.508707 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:17:59 crc kubenswrapper[4771]: E0319 15:17:59.508770 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.178629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.178705 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.178728 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.178756 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.178780 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:00Z","lastTransitionTime":"2026-03-19T15:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:00 crc kubenswrapper[4771]: E0319 15:18:00.199648 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:00Z is after 2025-08-24T17:21:41Z"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.204402 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.204472 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.204497 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.204526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.204547 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:00Z","lastTransitionTime":"2026-03-19T15:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:00 crc kubenswrapper[4771]: E0319 15:18:00.225797 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:00Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.230316 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.230376 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.230394 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.230417 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.230435 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:00Z","lastTransitionTime":"2026-03-19T15:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:00 crc kubenswrapper[4771]: E0319 15:18:00.251480 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:00Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.259662 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.259709 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.259741 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.259762 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.259774 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:00Z","lastTransitionTime":"2026-03-19T15:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:00 crc kubenswrapper[4771]: E0319 15:18:00.279511 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:00Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.283919 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.283959 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.283975 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.284101 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.284120 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:00Z","lastTransitionTime":"2026-03-19T15:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:00 crc kubenswrapper[4771]: E0319 15:18:00.304131 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:00Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:00 crc kubenswrapper[4771]: E0319 15:18:00.304676 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:18:00 crc kubenswrapper[4771]: I0319 15:18:00.508191 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:00 crc kubenswrapper[4771]: E0319 15:18:00.508379 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.508194 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.508196 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:01 crc kubenswrapper[4771]: E0319 15:18:01.508481 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.508534 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:01 crc kubenswrapper[4771]: E0319 15:18:01.508673 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:01 crc kubenswrapper[4771]: E0319 15:18:01.508772 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.522848 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.540328 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3
c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.559437 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.587700 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:17:47Z\\\",\\\"message\\\":\\\"iled to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z]\\\\nI0319 15:17:47.001050 6818 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001067 6818 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001022 6818 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_respon\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.602954 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.617445 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T1
5:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: E0319 15:18:01.623738 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.633222 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.646350 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.657276 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.673921 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.693119 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.706360 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.717902 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.731458 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.747829 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:01 crc kubenswrapper[4771]: I0319 15:18:01.759563 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:01Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:02 crc 
kubenswrapper[4771]: I0319 15:18:02.508493 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:02 crc kubenswrapper[4771]: E0319 15:18:02.508629 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:03 crc kubenswrapper[4771]: I0319 15:18:03.508051 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:03 crc kubenswrapper[4771]: I0319 15:18:03.508130 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:03 crc kubenswrapper[4771]: I0319 15:18:03.508162 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:03 crc kubenswrapper[4771]: E0319 15:18:03.508634 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:03 crc kubenswrapper[4771]: E0319 15:18:03.509091 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:03 crc kubenswrapper[4771]: E0319 15:18:03.509088 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:04 crc kubenswrapper[4771]: I0319 15:18:04.508491 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:04 crc kubenswrapper[4771]: E0319 15:18:04.508635 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:04 crc kubenswrapper[4771]: I0319 15:18:04.519829 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 15:18:05 crc kubenswrapper[4771]: I0319 15:18:05.508651 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:05 crc kubenswrapper[4771]: I0319 15:18:05.508728 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:05 crc kubenswrapper[4771]: E0319 15:18:05.508853 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:05 crc kubenswrapper[4771]: I0319 15:18:05.508873 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:05 crc kubenswrapper[4771]: E0319 15:18:05.508960 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:05 crc kubenswrapper[4771]: E0319 15:18:05.509083 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:06 crc kubenswrapper[4771]: I0319 15:18:06.507922 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:06 crc kubenswrapper[4771]: E0319 15:18:06.508194 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:06 crc kubenswrapper[4771]: E0319 15:18:06.625365 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 15:18:07 crc kubenswrapper[4771]: I0319 15:18:07.416108 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:07 crc kubenswrapper[4771]: E0319 15:18:07.416425 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:18:07 crc kubenswrapper[4771]: E0319 15:18:07.416592 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs podName:7fb3bb21-b72b-45e1-9b87-73f281abba90 nodeName:}" failed. No retries permitted until 2026-03-19 15:18:39.416563077 +0000 UTC m=+178.645184289 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs") pod "network-metrics-daemon-zjhnk" (UID: "7fb3bb21-b72b-45e1-9b87-73f281abba90") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:18:07 crc kubenswrapper[4771]: I0319 15:18:07.508198 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:07 crc kubenswrapper[4771]: I0319 15:18:07.508205 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:07 crc kubenswrapper[4771]: I0319 15:18:07.508625 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:07 crc kubenswrapper[4771]: I0319 15:18:07.508866 4771 scope.go:117] "RemoveContainer" containerID="74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034" Mar 19 15:18:07 crc kubenswrapper[4771]: E0319 15:18:07.508832 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:07 crc kubenswrapper[4771]: E0319 15:18:07.509428 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:07 crc kubenswrapper[4771]: E0319 15:18:07.509510 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.162053 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/1.log" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.165199 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee"} Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.165610 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.177306 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.186324 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.197629 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc 
kubenswrapper[4771]: I0319 15:18:08.208492 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.220996 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.232839 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.252674 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:17:47Z\\\",\\\"message\\\":\\\"iled to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z]\\\\nI0319 15:17:47.001050 6818 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001067 6818 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001022 6818 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_respon\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.265734 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.279425 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.291502 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.303933 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.316754 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.328724 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.341115 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.357044 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.372812 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.384898 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:08Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:08 crc kubenswrapper[4771]: I0319 15:18:08.508061 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:08 crc kubenswrapper[4771]: E0319 15:18:08.508237 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.170901 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/2.log" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.172498 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/1.log" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.176051 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee" exitCode=1 Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.176124 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee"} Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.176187 4771 scope.go:117] "RemoveContainer" containerID="74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.177166 4771 scope.go:117] "RemoveContainer" containerID="035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee" Mar 19 15:18:09 crc kubenswrapper[4771]: E0319 15:18:09.177421 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:18:09 crc 
kubenswrapper[4771]: I0319 15:18:09.199280 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.218499 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.249274 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74c85ac463840433cb0a593c80a6d077df03146725d4c97343da426ba9bb5034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:17:47Z\\\",\\\"message\\\":\\\"iled to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:17:46Z is after 2025-08-24T17:21:41Z]\\\\nI0319 15:17:47.001050 6818 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001067 6818 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-zjhnk\\\\nI0319 15:17:47.001022 6818 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_respon\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:08Z\\\",\\\"message\\\":\\\":18:08.381220 7037 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 15:18:08.381224 7037 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:18:08.381237 7037 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:18:08.381251 7037 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0319 15:18:08.381253 7037 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 15:18:08.381291 7037 factory.go:656] Stopping watch factory\\\\nI0319 15:18:08.381310 7037 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0319 15:18:08.381331 7037 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 15:18:08.381385 7037 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0319 15:18:08.381481 7037 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0319 15:18:08.381502 7037 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI0319 15:18:08.381511 7037 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:18:08.381515 7037 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.95757ms\\\\nI0319 15:18:08.381534 7037 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 15:18:08.381608 7037 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.265070 4771 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.282127 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.296531 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.310911 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.325564 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.341578 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.354349 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.369636 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.382916 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.393719 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.403966 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.414771 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.423919 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc 
kubenswrapper[4771]: I0319 15:18:09.436652 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:09Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.508320 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.508363 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:09 crc kubenswrapper[4771]: I0319 15:18:09.508343 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:09 crc kubenswrapper[4771]: E0319 15:18:09.508604 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:09 crc kubenswrapper[4771]: E0319 15:18:09.508661 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:09 crc kubenswrapper[4771]: E0319 15:18:09.508748 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.186946 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/2.log" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.191682 4771 scope.go:117] "RemoveContainer" containerID="035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee" Mar 19 15:18:10 crc kubenswrapper[4771]: E0319 15:18:10.192172 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.205653 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f7087
2e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.218476 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.246934 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:08Z\\\",\\\"message\\\":\\\":18:08.381220 7037 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 15:18:08.381224 7037 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:18:08.381237 7037 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:18:08.381251 7037 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0319 
15:18:08.381253 7037 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 15:18:08.381291 7037 factory.go:656] Stopping watch factory\\\\nI0319 15:18:08.381310 7037 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 15:18:08.381331 7037 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 15:18:08.381385 7037 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0319 15:18:08.381481 7037 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0319 15:18:08.381502 7037 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI0319 15:18:08.381511 7037 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:18:08.381515 7037 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.95757ms\\\\nI0319 15:18:08.381534 7037 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 15:18:08.381608 7037 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.258551 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.280265 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.298793 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.319735 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.334396 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.349171 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.363459 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.387046 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.403355 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.419141 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.434653 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.451896 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.471900 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc 
kubenswrapper[4771]: I0319 15:18:10.472362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.472448 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.472501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.472529 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.472548 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:10Z","lastTransitionTime":"2026-03-19T15:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:10 crc kubenswrapper[4771]: E0319 15:18:10.493802 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.494430 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.499574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.499609 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.499622 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.499639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.499651 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:10Z","lastTransitionTime":"2026-03-19T15:18:10Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.508694 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:10 crc kubenswrapper[4771]: E0319 15:18:10.508871 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:10 crc kubenswrapper[4771]: E0319 15:18:10.515428 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.520631 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.520679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.520691 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.520708 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.521052 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:10Z","lastTransitionTime":"2026-03-19T15:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:18:10 crc kubenswrapper[4771]: E0319 15:18:10.541133 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.545798 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.545846 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.545864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.545888 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.545905 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:10Z","lastTransitionTime":"2026-03-19T15:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:18:10 crc kubenswrapper[4771]: E0319 15:18:10.563828 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.569541 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.569610 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.569630 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.569654 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:10 crc kubenswrapper[4771]: I0319 15:18:10.569671 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:10Z","lastTransitionTime":"2026-03-19T15:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:18:10 crc kubenswrapper[4771]: E0319 15:18:10.586428 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:10Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:10 crc kubenswrapper[4771]: E0319 15:18:10.586577 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.507857 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.507922 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.508469 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:11 crc kubenswrapper[4771]: E0319 15:18:11.508620 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:11 crc kubenswrapper[4771]: E0319 15:18:11.508691 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:11 crc kubenswrapper[4771]: E0319 15:18:11.508770 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.523892 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.529271 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.545147 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.555959 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.566883 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.583304 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.604099 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:08Z\\\",\\\"message\\\":\\\":18:08.381220 7037 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 15:18:08.381224 7037 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:18:08.381237 7037 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:18:08.381251 7037 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0319 
15:18:08.381253 7037 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 15:18:08.381291 7037 factory.go:656] Stopping watch factory\\\\nI0319 15:18:08.381310 7037 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 15:18:08.381331 7037 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 15:18:08.381385 7037 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0319 15:18:08.381481 7037 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0319 15:18:08.381502 7037 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI0319 15:18:08.381511 7037 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:18:08.381515 7037 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.95757ms\\\\nI0319 15:18:08.381534 7037 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 15:18:08.381608 7037 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.620064 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: E0319 15:18:11.626033 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.645775 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a9
4cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.667105 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.678478 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.688908 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.702892 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.714717 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc 
kubenswrapper[4771]: I0319 15:18:11.732250 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.744377 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.756899 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:11 crc kubenswrapper[4771]: I0319 15:18:11.770600 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:11Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:12 crc kubenswrapper[4771]: I0319 15:18:12.507950 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:12 crc kubenswrapper[4771]: E0319 15:18:12.508195 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:13 crc kubenswrapper[4771]: I0319 15:18:13.508688 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:13 crc kubenswrapper[4771]: I0319 15:18:13.508709 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:13 crc kubenswrapper[4771]: I0319 15:18:13.509017 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:13 crc kubenswrapper[4771]: E0319 15:18:13.509094 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:13 crc kubenswrapper[4771]: E0319 15:18:13.508902 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:13 crc kubenswrapper[4771]: E0319 15:18:13.509204 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:14 crc kubenswrapper[4771]: I0319 15:18:14.507901 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:14 crc kubenswrapper[4771]: E0319 15:18:14.508169 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:15 crc kubenswrapper[4771]: I0319 15:18:15.508231 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:15 crc kubenswrapper[4771]: I0319 15:18:15.508331 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:15 crc kubenswrapper[4771]: I0319 15:18:15.508338 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:15 crc kubenswrapper[4771]: E0319 15:18:15.508478 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:15 crc kubenswrapper[4771]: E0319 15:18:15.508693 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:15 crc kubenswrapper[4771]: E0319 15:18:15.508825 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:16 crc kubenswrapper[4771]: I0319 15:18:16.508337 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:16 crc kubenswrapper[4771]: E0319 15:18:16.508491 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:16 crc kubenswrapper[4771]: E0319 15:18:16.627033 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:18:17 crc kubenswrapper[4771]: I0319 15:18:17.508110 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:17 crc kubenswrapper[4771]: I0319 15:18:17.508179 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:17 crc kubenswrapper[4771]: E0319 15:18:17.508249 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:17 crc kubenswrapper[4771]: E0319 15:18:17.508306 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:17 crc kubenswrapper[4771]: I0319 15:18:17.508491 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:17 crc kubenswrapper[4771]: E0319 15:18:17.508559 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:18 crc kubenswrapper[4771]: I0319 15:18:18.508505 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:18 crc kubenswrapper[4771]: E0319 15:18:18.508685 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:19 crc kubenswrapper[4771]: I0319 15:18:19.345807 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:18:19 crc kubenswrapper[4771]: I0319 15:18:19.345955 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:19 crc kubenswrapper[4771]: I0319 15:18:19.346022 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.346116 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.346167 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-19 15:19:23.346152459 +0000 UTC m=+222.574773661 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.346340 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:23.346331923 +0000 UTC m=+222.574953115 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.346397 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.346419 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:19:23.346413245 +0000 UTC m=+222.575034447 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 15:18:19 crc kubenswrapper[4771]: I0319 15:18:19.447415 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:19 crc kubenswrapper[4771]: I0319 15:18:19.447496 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.447638 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.447673 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.447685 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.447734 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.447768 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.447792 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.447747 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:19:23.447729169 +0000 UTC m=+222.676350461 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.447874 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:19:23.447852762 +0000 UTC m=+222.676474004 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 15:18:19 crc kubenswrapper[4771]: I0319 15:18:19.507587 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:19 crc kubenswrapper[4771]: I0319 15:18:19.507622 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:19 crc kubenswrapper[4771]: I0319 15:18:19.507622 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.507718 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.507820 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:19 crc kubenswrapper[4771]: E0319 15:18:19.507869 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.507758 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:20 crc kubenswrapper[4771]: E0319 15:18:20.507969 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.666880 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.667257 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.667397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.667537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.667668 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:20Z","lastTransitionTime":"2026-03-19T15:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:20 crc kubenswrapper[4771]: E0319 15:18:20.687505 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:20Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.691950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.692016 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.692033 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.692055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.692072 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:20Z","lastTransitionTime":"2026-03-19T15:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:20 crc kubenswrapper[4771]: E0319 15:18:20.708089 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:20Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.712579 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.712626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.712637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.712653 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.712665 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:20Z","lastTransitionTime":"2026-03-19T15:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:20 crc kubenswrapper[4771]: E0319 15:18:20.731572 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:20Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.741240 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.741276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.741307 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.741324 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.741336 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:20Z","lastTransitionTime":"2026-03-19T15:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:20 crc kubenswrapper[4771]: E0319 15:18:20.760567 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:20Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.765231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.765441 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.765587 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.765745 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:20 crc kubenswrapper[4771]: I0319 15:18:20.765901 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:20Z","lastTransitionTime":"2026-03-19T15:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:20 crc kubenswrapper[4771]: E0319 15:18:20.785896 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:20Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:20 crc kubenswrapper[4771]: E0319 15:18:20.786471 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.507682 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.507753 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.507854 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:21 crc kubenswrapper[4771]: E0319 15:18:21.507856 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:21 crc kubenswrapper[4771]: E0319 15:18:21.507926 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:21 crc kubenswrapper[4771]: E0319 15:18:21.508161 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.532327 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7968c7aa-c5fd-4cd9-b265-c9b7527a86dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23395a26fca7d054420d5106512839bed7889bc626449e23186e85233c76a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\
\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.545976 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.561799 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.573696 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.586941 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.598458 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc 
kubenswrapper[4771]: I0319 15:18:21.611812 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.620770 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: E0319 15:18:21.627753 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.637508 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce
98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.651573 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.670070 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:08Z\\\",\\\"message\\\":\\\":18:08.381220 7037 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 15:18:08.381224 7037 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:18:08.381237 7037 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:18:08.381251 7037 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0319 
15:18:08.381253 7037 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 15:18:08.381291 7037 factory.go:656] Stopping watch factory\\\\nI0319 15:18:08.381310 7037 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 15:18:08.381331 7037 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 15:18:08.381385 7037 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0319 15:18:08.381481 7037 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0319 15:18:08.381502 7037 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI0319 15:18:08.381511 7037 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:18:08.381515 7037 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.95757ms\\\\nI0319 15:18:08.381534 7037 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 15:18:08.381608 7037 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.680474 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.693395 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T1
5:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.704669 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.717090 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.729228 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.740286 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:21 crc kubenswrapper[4771]: I0319 15:18:21.753682 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:21Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:22 crc kubenswrapper[4771]: I0319 15:18:22.508398 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:22 crc kubenswrapper[4771]: E0319 15:18:22.508531 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:23 crc kubenswrapper[4771]: I0319 15:18:23.508035 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:23 crc kubenswrapper[4771]: I0319 15:18:23.508122 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:23 crc kubenswrapper[4771]: E0319 15:18:23.508204 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:23 crc kubenswrapper[4771]: I0319 15:18:23.508287 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:23 crc kubenswrapper[4771]: E0319 15:18:23.508474 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:23 crc kubenswrapper[4771]: E0319 15:18:23.508633 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.239539 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9989m_51f8c2de-454d-4b7c-bf30-2f5d12d7088e/kube-multus/0.log" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.239593 4771 generic.go:334] "Generic (PLEG): container finished" podID="51f8c2de-454d-4b7c-bf30-2f5d12d7088e" containerID="b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9" exitCode=1 Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.239633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9989m" event={"ID":"51f8c2de-454d-4b7c-bf30-2f5d12d7088e","Type":"ContainerDied","Data":"b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9"} Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.240044 4771 scope.go:117] "RemoveContainer" containerID="b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9" Mar 19 15:18:24 crc 
kubenswrapper[4771]: I0319 15:18:24.252067 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z
cnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.272911 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da
410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.287018 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.304507 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f7087
2e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.323862 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:23Z\\\",\\\"message\\\":\\\"2026-03-19T15:17:37+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85\\\\n2026-03-19T15:17:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85 to /host/opt/cni/bin/\\\\n2026-03-19T15:17:38Z [verbose] multus-daemon started\\\\n2026-03-19T15:17:38Z [verbose] Readiness Indicator file check\\\\n2026-03-19T15:18:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.350742 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:08Z\\\",\\\"message\\\":\\\":18:08.381220 7037 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 
15:18:08.381224 7037 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:18:08.381237 7037 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:18:08.381251 7037 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0319 15:18:08.381253 7037 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 15:18:08.381291 7037 factory.go:656] Stopping watch factory\\\\nI0319 15:18:08.381310 7037 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 15:18:08.381331 7037 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 15:18:08.381385 7037 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0319 15:18:08.381481 7037 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0319 15:18:08.381502 7037 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI0319 15:18:08.381511 7037 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:18:08.381515 7037 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.95757ms\\\\nI0319 15:18:08.381534 7037 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 15:18:08.381608 7037 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.364461 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.384413 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.404059 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.419124 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.431959 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.446071 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.458191 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc 
kubenswrapper[4771]: I0319 15:18:24.477033 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7968c7aa-c5fd-4cd9-b265-c9b7527a86dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23395a26fca7d054420d5106512839bed7889bc626449e23186e85233c76a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.492719 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.506914 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.508698 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:24 crc kubenswrapper[4771]: E0319 15:18:24.508826 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.509042 4771 scope.go:117] "RemoveContainer" containerID="035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee" Mar 19 15:18:24 crc kubenswrapper[4771]: E0319 15:18:24.509175 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.519367 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b
19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:24 crc kubenswrapper[4771]: I0319 15:18:24.536117 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc0
61569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:24Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.244632 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9989m_51f8c2de-454d-4b7c-bf30-2f5d12d7088e/kube-multus/0.log" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.244688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9989m" 
event={"ID":"51f8c2de-454d-4b7c-bf30-2f5d12d7088e","Type":"ContainerStarted","Data":"65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2"} Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.263053 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.283956 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.306117 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.330305 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.352167 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.371785 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.387816 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.403973 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.417061 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc 
kubenswrapper[4771]: I0319 15:18:25.451860 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7968c7aa-c5fd-4cd9-b265-c9b7527a86dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23395a26fca7d054420d5106512839bed7889bc626449e23186e85233c76a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.471179 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.490330 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.505220 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:23Z\\\",\\\"message\\\":\\\"2026-03-19T15:17:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85\\\\n2026-03-19T15:17:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85 to /host/opt/cni/bin/\\\\n2026-03-19T15:17:38Z [verbose] multus-daemon started\\\\n2026-03-19T15:17:38Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T15:18:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.508497 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.508511 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:25 crc kubenswrapper[4771]: E0319 15:18:25.508591 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:25 crc kubenswrapper[4771]: E0319 15:18:25.508792 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.508457 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:25 crc kubenswrapper[4771]: E0319 15:18:25.508942 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.528456 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:08Z\\\",\\\"message\\\":\\\":18:08.381220 7037 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 
15:18:08.381224 7037 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:18:08.381237 7037 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:18:08.381251 7037 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0319 15:18:08.381253 7037 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 15:18:08.381291 7037 factory.go:656] Stopping watch factory\\\\nI0319 15:18:08.381310 7037 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 15:18:08.381331 7037 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 15:18:08.381385 7037 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0319 15:18:08.381481 7037 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0319 15:18:08.381502 7037 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI0319 15:18:08.381511 7037 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:18:08.381515 7037 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.95757ms\\\\nI0319 15:18:08.381534 7037 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 15:18:08.381608 7037 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.540960 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.553038 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.563579 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:25 crc kubenswrapper[4771]: I0319 15:18:25.573521 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:25Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:26 crc kubenswrapper[4771]: I0319 15:18:26.508605 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:26 crc kubenswrapper[4771]: E0319 15:18:26.508727 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:26 crc kubenswrapper[4771]: E0319 15:18:26.641969 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:18:27 crc kubenswrapper[4771]: I0319 15:18:27.507816 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:27 crc kubenswrapper[4771]: I0319 15:18:27.507878 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:27 crc kubenswrapper[4771]: E0319 15:18:27.507948 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:27 crc kubenswrapper[4771]: I0319 15:18:27.508011 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:27 crc kubenswrapper[4771]: E0319 15:18:27.508105 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:27 crc kubenswrapper[4771]: E0319 15:18:27.508311 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:28 crc kubenswrapper[4771]: I0319 15:18:28.508098 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:28 crc kubenswrapper[4771]: E0319 15:18:28.508222 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:29 crc kubenswrapper[4771]: I0319 15:18:29.508213 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:29 crc kubenswrapper[4771]: E0319 15:18:29.508340 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:29 crc kubenswrapper[4771]: I0319 15:18:29.508502 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:29 crc kubenswrapper[4771]: I0319 15:18:29.508927 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:29 crc kubenswrapper[4771]: E0319 15:18:29.509019 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:29 crc kubenswrapper[4771]: E0319 15:18:29.509156 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:29 crc kubenswrapper[4771]: I0319 15:18:29.521863 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 19 15:18:30 crc kubenswrapper[4771]: I0319 15:18:30.508154 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:30 crc kubenswrapper[4771]: E0319 15:18:30.508324 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.001846 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.001906 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.001925 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.001948 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.001967 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:31Z","lastTransitionTime":"2026-03-19T15:18:31Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:18:31 crc kubenswrapper[4771]: E0319 15:18:31.019980 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126
c5679a04\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.024199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.024251 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.024262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.024277 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.024289 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:31Z","lastTransitionTime":"2026-03-19T15:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:31 crc kubenswrapper[4771]: E0319 15:18:31.042090 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.046594 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.046687 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.046706 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.046763 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.046784 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:31Z","lastTransitionTime":"2026-03-19T15:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:31 crc kubenswrapper[4771]: E0319 15:18:31.062135 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.067521 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.067560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.067575 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.067592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.067603 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:31Z","lastTransitionTime":"2026-03-19T15:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:31 crc kubenswrapper[4771]: E0319 15:18:31.083412 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.088144 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.088209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.088232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.088261 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.088282 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:31Z","lastTransitionTime":"2026-03-19T15:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:31 crc kubenswrapper[4771]: E0319 15:18:31.108699 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: E0319 15:18:31.108928 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.507691 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.507695 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.507816 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:31 crc kubenswrapper[4771]: E0319 15:18:31.507921 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:31 crc kubenswrapper[4771]: E0319 15:18:31.508130 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:31 crc kubenswrapper[4771]: E0319 15:18:31.508266 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.544842 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7968c7aa-c5fd-4cd9-b265-c9b7527a86dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23395a26fca7d054420d5106512839bed7889bc626449e23186e85233c76a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.564950 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.582050 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.595928 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.608545 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.622658 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc 
kubenswrapper[4771]: E0319 15:18:31.644168 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.645821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\
\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.660135 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.671821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.685298 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:23Z\\\",\\\"message\\\":\\\"2026-03-19T15:17:37+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85\\\\n2026-03-19T15:17:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85 to /host/opt/cni/bin/\\\\n2026-03-19T15:17:38Z [verbose] multus-daemon started\\\\n2026-03-19T15:17:38Z [verbose] Readiness Indicator file check\\\\n2026-03-19T15:18:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.712528 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:08Z\\\",\\\"message\\\":\\\":18:08.381220 7037 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 15:18:08.381224 7037 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:18:08.381237 7037 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:18:08.381251 7037 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0319 
15:18:08.381253 7037 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 15:18:08.381291 7037 factory.go:656] Stopping watch factory\\\\nI0319 15:18:08.381310 7037 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 15:18:08.381331 7037 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 15:18:08.381385 7037 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0319 15:18:08.381481 7037 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0319 15:18:08.381502 7037 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI0319 15:18:08.381511 7037 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:18:08.381515 7037 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.95757ms\\\\nI0319 15:18:08.381534 7037 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 15:18:08.381608 7037 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.724626 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.742038 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc
3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596
a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.753021 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7c8c95-d715-48ad-9244-c035db503075\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7c0836517f4ced7fafcb12ff073d4ef8885f18aec49a1e28404bb23103c42c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.766499 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d31
81d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.781398 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.794597 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.808249 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:31 crc kubenswrapper[4771]: I0319 15:18:31.822258 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:31Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:32 crc kubenswrapper[4771]: I0319 15:18:32.508534 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:32 crc kubenswrapper[4771]: E0319 15:18:32.508642 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:33 crc kubenswrapper[4771]: I0319 15:18:33.507927 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:33 crc kubenswrapper[4771]: I0319 15:18:33.508067 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:33 crc kubenswrapper[4771]: E0319 15:18:33.508218 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:33 crc kubenswrapper[4771]: I0319 15:18:33.508411 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:33 crc kubenswrapper[4771]: E0319 15:18:33.508422 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:33 crc kubenswrapper[4771]: E0319 15:18:33.508603 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:34 crc kubenswrapper[4771]: I0319 15:18:34.508083 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:34 crc kubenswrapper[4771]: E0319 15:18:34.508287 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:35 crc kubenswrapper[4771]: I0319 15:18:35.508834 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:35 crc kubenswrapper[4771]: I0319 15:18:35.508926 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:35 crc kubenswrapper[4771]: I0319 15:18:35.508966 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:35 crc kubenswrapper[4771]: E0319 15:18:35.509069 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:35 crc kubenswrapper[4771]: E0319 15:18:35.509259 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:35 crc kubenswrapper[4771]: E0319 15:18:35.509305 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:35 crc kubenswrapper[4771]: I0319 15:18:35.510106 4771 scope.go:117] "RemoveContainer" containerID="035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.281752 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/2.log" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.285977 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.288253 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.314327 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7c8c95-d715-48ad-9244-c035db503075\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7c0836517f4ced7fafcb12ff073d4ef8885f18aec49a1e28404bb23103c42c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.343809 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d31
81d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.357896 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.376542 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.391977 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.404642 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.420484 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.461487 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7968c7aa-c5fd-4cd9-b265-c9b7527a86dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23395a26fca7d05
4420d5106512839bed7889bc626449e23186e85233c76a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.482632 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.505917 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.508229 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:36 crc kubenswrapper[4771]: E0319 15:18:36.508379 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.525596 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.543750 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.559136 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc 
kubenswrapper[4771]: I0319 15:18:36.574167 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.586185 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.597377 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.610193 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:23Z\\\",\\\"message\\\":\\\"2026-03-19T15:17:37+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85\\\\n2026-03-19T15:17:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85 to /host/opt/cni/bin/\\\\n2026-03-19T15:17:38Z [verbose] multus-daemon started\\\\n2026-03-19T15:17:38Z [verbose] Readiness Indicator file check\\\\n2026-03-19T15:18:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.630614 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:08Z\\\",\\\"message\\\":\\\":18:08.381220 7037 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 15:18:08.381224 7037 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:18:08.381237 7037 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:18:08.381251 7037 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0319 
15:18:08.381253 7037 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 15:18:08.381291 7037 factory.go:656] Stopping watch factory\\\\nI0319 15:18:08.381310 7037 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 15:18:08.381331 7037 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 15:18:08.381385 7037 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0319 15:18:08.381481 7037 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0319 15:18:08.381502 7037 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI0319 15:18:08.381511 7037 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:18:08.381515 7037 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.95757ms\\\\nI0319 15:18:08.381534 7037 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 15:18:08.381608 7037 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: I0319 15:18:36.644433 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:36 crc kubenswrapper[4771]: E0319 15:18:36.645249 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.292603 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/3.log" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.293605 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/2.log" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.297475 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" exitCode=1 Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.297534 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.297601 4771 scope.go:117] "RemoveContainer" 
containerID="035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.298972 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:18:37 crc kubenswrapper[4771]: E0319 15:18:37.299369 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.311425 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.325649 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc 
kubenswrapper[4771]: I0319 15:18:37.359563 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7968c7aa-c5fd-4cd9-b265-c9b7527a86dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23395a26fca7d054420d5106512839bed7889bc626449e23186e85233c76a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.378265 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.396214 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.413177 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.444245 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://035bbdec0b1ce9c93570410f2d19b6644ac43f18bda883ba71be3874a485d8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:08Z\\\",\\\"message\\\":\\\":18:08.381220 7037 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 15:18:08.381224 7037 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0319 15:18:08.381237 7037 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 15:18:08.381251 7037 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0319 
15:18:08.381253 7037 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 15:18:08.381291 7037 factory.go:656] Stopping watch factory\\\\nI0319 15:18:08.381310 7037 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 15:18:08.381331 7037 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 15:18:08.381385 7037 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0319 15:18:08.381481 7037 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0319 15:18:08.381502 7037 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI0319 15:18:08.381511 7037 ovnkube.go:599] Stopped ovnkube\\\\nI0319 15:18:08.381515 7037 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.95757ms\\\\nI0319 15:18:08.381534 7037 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 15:18:08.381608 7037 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:36Z\\\",\\\"message\\\":\\\"{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 15:18:36.782401 7385 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0319 15:18:36.782409 7385 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nI0319 15:18:36.782409 7385 lb_config.go:1031] Cluster 
endpoints for openshift-marketplace/redhat-marketplace for network=default are: map[]\\\\nF0319 15:18:36.782418 7385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\
\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.458772 4771 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.475441 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0846
52d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.489952 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.505165 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f7087
2e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.508305 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.508301 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.508320 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:37 crc kubenswrapper[4771]: E0319 15:18:37.508533 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:37 crc kubenswrapper[4771]: E0319 15:18:37.508697 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:37 crc kubenswrapper[4771]: E0319 15:18:37.508875 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.526620 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:23Z\\\",\\\"message\\\":\\\"2026-03-19T15:17:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85\\\\n2026-03-19T15:17:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85 to /host/opt/cni/bin/\\\\n2026-03-19T15:17:38Z [verbose] multus-daemon started\\\\n2026-03-19T15:17:38Z [verbose] Readiness Indicator file check\\\\n2026-03-19T15:18:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.543178 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.558702 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.581871 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.596817 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7c8c95-d715-48ad-9244-c035db503075\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7c0836517f4ced7fafcb12ff073d4ef8885f18aec49a1e28404bb23103c42c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.617351 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d31
81d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.631635 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:37 crc kubenswrapper[4771]: I0319 15:18:37.653153 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:37Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.304314 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/3.log" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.309104 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:18:38 crc kubenswrapper[4771]: E0319 15:18:38.309540 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.326094 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac727
3d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.342690 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.363676 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.383408 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.405525 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.416937 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7c8c95-d715-48ad-9244-c035db503075\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7c0836517f4ced7fafcb12ff073d4ef8885f18aec49a1e28404bb23103c42c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.433864 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d31
81d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.452721 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.469513 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.483817 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.499699 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc 
kubenswrapper[4771]: I0319 15:18:38.507897 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:38 crc kubenswrapper[4771]: E0319 15:18:38.508091 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.522258 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7968c7aa-c5fd-4cd9-b265-c9b7527a86dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd
-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23395a26fca7d054420d5106512839bed7889bc626449e23186e85233c76a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.541179 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.558419 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.574548 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:23Z\\\",\\\"message\\\":\\\"2026-03-19T15:17:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85\\\\n2026-03-19T15:17:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85 to /host/opt/cni/bin/\\\\n2026-03-19T15:17:38Z [verbose] multus-daemon started\\\\n2026-03-19T15:17:38Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T15:18:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.594800 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:36Z\\\",\\\"message\\\":\\\"{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 
15:18:36.782401 7385 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0319 15:18:36.782409 7385 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nI0319 15:18:36.782409 7385 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-marketplace for network=default are: map[]\\\\nF0319 15:18:36.782418 7385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.609076 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.623501 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:38 crc kubenswrapper[4771]: I0319 15:18:38.637704 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:38Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:39 crc kubenswrapper[4771]: I0319 15:18:39.474630 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:39 crc kubenswrapper[4771]: E0319 15:18:39.474868 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:18:39 crc kubenswrapper[4771]: E0319 15:18:39.474945 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs podName:7fb3bb21-b72b-45e1-9b87-73f281abba90 nodeName:}" failed. No retries permitted until 2026-03-19 15:19:43.474920552 +0000 UTC m=+242.703541784 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs") pod "network-metrics-daemon-zjhnk" (UID: "7fb3bb21-b72b-45e1-9b87-73f281abba90") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 15:18:39 crc kubenswrapper[4771]: I0319 15:18:39.508635 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:39 crc kubenswrapper[4771]: I0319 15:18:39.508759 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:39 crc kubenswrapper[4771]: I0319 15:18:39.508861 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:39 crc kubenswrapper[4771]: E0319 15:18:39.508855 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:39 crc kubenswrapper[4771]: E0319 15:18:39.508963 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:39 crc kubenswrapper[4771]: E0319 15:18:39.509162 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:40 crc kubenswrapper[4771]: I0319 15:18:40.507910 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:40 crc kubenswrapper[4771]: E0319 15:18:40.508084 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.293923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.293977 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.294010 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.294031 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.294044 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:41Z","lastTransitionTime":"2026-03-19T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.309596 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.314285 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.314329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.314345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.314369 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.314387 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:41Z","lastTransitionTime":"2026-03-19T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.334092 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.338217 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.338291 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.338308 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.338329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.338347 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:41Z","lastTransitionTime":"2026-03-19T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.359091 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.363436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.363531 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.363555 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.363587 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.363610 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:41Z","lastTransitionTime":"2026-03-19T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.379323 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.386136 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.386219 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.386233 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.386254 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.386277 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:41Z","lastTransitionTime":"2026-03-19T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.398692 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.398926 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.508137 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.508224 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.508361 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.508397 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.508542 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.508719 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.536789 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:36Z\\\",\\\"message\\\":\\\"{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 15:18:36.782401 7385 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0319 15:18:36.782409 7385 services_controller.go:453] Built service 
openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nI0319 15:18:36.782409 7385 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-marketplace for network=default are: map[]\\\\nF0319 15:18:36.782418 7385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.549509 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.565447 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.579617 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.594278 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.606884 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:23Z\\\",\\\"message\\\":\\\"2026-03-19T15:17:37+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85\\\\n2026-03-19T15:17:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85 to /host/opt/cni/bin/\\\\n2026-03-19T15:17:38Z [verbose] multus-daemon started\\\\n2026-03-19T15:17:38Z [verbose] Readiness Indicator file check\\\\n2026-03-19T15:18:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.619361 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.631848 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.646412 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: E0319 15:18:41.647221 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.657947 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7c8c95-d715-48ad-9244-c035db503075\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7c0836517f4ced7fafcb12ff073d4ef8885f18aec49a1e28404bb23103c42c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.674935 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\
\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.689182 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.703440 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.714804 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc0
61569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.725088 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc 
kubenswrapper[4771]: I0319 15:18:41.752623 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7968c7aa-c5fd-4cd9-b265-c9b7527a86dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23395a26fca7d054420d5106512839bed7889bc626449e23186e85233c76a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.766949 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.780543 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:41 crc kubenswrapper[4771]: I0319 15:18:41.796216 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:41Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:42 crc kubenswrapper[4771]: I0319 15:18:42.508186 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:42 crc kubenswrapper[4771]: E0319 15:18:42.508688 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:43 crc kubenswrapper[4771]: I0319 15:18:43.510779 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:43 crc kubenswrapper[4771]: E0319 15:18:43.510905 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:43 crc kubenswrapper[4771]: I0319 15:18:43.510972 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:43 crc kubenswrapper[4771]: E0319 15:18:43.511041 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:43 crc kubenswrapper[4771]: I0319 15:18:43.511079 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:43 crc kubenswrapper[4771]: E0319 15:18:43.511125 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:44 crc kubenswrapper[4771]: I0319 15:18:44.508165 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:44 crc kubenswrapper[4771]: E0319 15:18:44.508323 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:45 crc kubenswrapper[4771]: I0319 15:18:45.508224 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:45 crc kubenswrapper[4771]: I0319 15:18:45.508266 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:45 crc kubenswrapper[4771]: I0319 15:18:45.508310 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:45 crc kubenswrapper[4771]: E0319 15:18:45.508390 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:45 crc kubenswrapper[4771]: E0319 15:18:45.508491 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:45 crc kubenswrapper[4771]: E0319 15:18:45.508576 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:46 crc kubenswrapper[4771]: I0319 15:18:46.507841 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:46 crc kubenswrapper[4771]: E0319 15:18:46.508002 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:46 crc kubenswrapper[4771]: E0319 15:18:46.648687 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:18:47 crc kubenswrapper[4771]: I0319 15:18:47.508521 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:47 crc kubenswrapper[4771]: I0319 15:18:47.508664 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:47 crc kubenswrapper[4771]: E0319 15:18:47.508727 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:47 crc kubenswrapper[4771]: I0319 15:18:47.508781 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:47 crc kubenswrapper[4771]: E0319 15:18:47.508940 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:47 crc kubenswrapper[4771]: E0319 15:18:47.509118 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:48 crc kubenswrapper[4771]: I0319 15:18:48.508591 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:48 crc kubenswrapper[4771]: E0319 15:18:48.509197 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:49 crc kubenswrapper[4771]: I0319 15:18:49.508240 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:49 crc kubenswrapper[4771]: I0319 15:18:49.508275 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:49 crc kubenswrapper[4771]: E0319 15:18:49.508346 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:49 crc kubenswrapper[4771]: E0319 15:18:49.508470 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:49 crc kubenswrapper[4771]: I0319 15:18:49.508287 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:49 crc kubenswrapper[4771]: E0319 15:18:49.508584 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:50 crc kubenswrapper[4771]: I0319 15:18:50.508681 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:50 crc kubenswrapper[4771]: E0319 15:18:50.509718 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:50 crc kubenswrapper[4771]: I0319 15:18:50.510312 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:18:50 crc kubenswrapper[4771]: E0319 15:18:50.510645 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.508265 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.508524 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.508280 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.508657 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.508281 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.508787 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.520834 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bde5c1-4714-4fff-bab9-3bbc84a71782\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3edbf9a6ce7f44fdc2552cd7513ed6392815a23ac35703a6ad947071e3f48ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac86dc8367a1e174793b8119c6ba82a65fc061569c7cdbe10645ba97d7ae535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v4k29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rgdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.535736 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-zjhnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fb3bb21-b72b-45e1-9b87-73f281abba90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5dzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjhnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc 
kubenswrapper[4771]: I0319 15:18:51.568433 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7968c7aa-c5fd-4cd9-b265-c9b7527a86dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454b435a5eabdf4a64f4e08ed67ad3926ea1c357ff0bec140aaf1b21bfd6a5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://96547efd86e809f3eeaecd398a7ce82d58b99e95ee29d5dc0a08abae402c28cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56b40f0d1b17c4f9997ef2713d4a5ecf59283afa491298222f7c6fef56fd2aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23395a26fca7d054420d5106512839bed7889bc626449e23186e85233c76a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950511c16fe7428ba42f181865b5f6d48c15a001fd40682fe33cb19fabb07482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de8f745656bf82c5fd38635d48c5b7b34483a992c0e69091d391136f4f2a22b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdcd904e4f9d721ba401d6244137a4698ce5197dcc2daadb2e68014f0bca1ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f6af54a4c433b2f583c08b1a7e50542298041f410c9190fd8f348b5275d8a08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.593933 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb22dfd134e0177d6bfbc54f8153484523f1bf914f5ea39663d077ccd126482f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.614501 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.629117 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2baa45ddbdc23104ed9bc89ef28ad55d62e01ac4f2776bc3203214da6c0fe03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.650189 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.661666 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf31981b-d437-4216-a275-5b566d8c49aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:36Z\\\",\\\"message\\\":\\\"{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 15:18:36.782401 7385 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0319 15:18:36.782409 7385 services_controller.go:453] Built service 
openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nI0319 15:18:36.782409 7385 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-marketplace for network=default are: map[]\\\\nF0319 15:18:36.782418 7385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:36Z\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:18:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f7cc184927fb898
7b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hk5n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b6zx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.679311 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhmqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ee37da-7e5a-49de-bf2b-0857fa6f36b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a258947be7196dfb19ba7ba59a20b522b4af2eaa5e5c154bbac89dc243fb633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcnzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhmqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.696662 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2ebf37-b8db-4193-bdcb-dd9d10ba0b47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d83b8489925592a1793a8d1fdc5237fc52c2742f581701166f705577dd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d88f1db3503c5fd2f12fc248d6274a805c322b793f1d6585968b39f5a461610e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 15:15:43.772545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 15:15:43.775411 1 observer_polling.go:159] Starting file observer\\\\nI0319 15:15:43.816151 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 15:15:43.819414 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 15:16:14.124542 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:16:13Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83324065bc3f8c142d9e97172aa6f22d07dc652071a0ed4365a449510d18b9e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5926799a8fe3e378dd794ebae2a622e7ea61fef68043085471dd9de44e40baa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.709982 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hg7b2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffd80ace-f6fe-4fd5-99c0-6dd74155d2fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9c76ef6ff080071046ee568fc0689ed7dd1a0f6305ba9e8805058d799f5a879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzjlt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hg7b2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.725125 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2b6e948-bbef-4217-b0eb-4cdbf711037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce98359471e6ebc3c781c53f9143d8aedb0563c958c98591c755d9423ea41d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgtqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wqbzp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.740963 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9989m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8c2de-454d-4b7c-bf30-2f5d12d7088e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T15:18:23Z\\\",\\\"message\\\":\\\"2026-03-19T15:17:37+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85\\\\n2026-03-19T15:17:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_846f6fd3-ce25-45ab-911b-9edc4950aa85 to /host/opt/cni/bin/\\\\n2026-03-19T15:17:38Z [verbose] multus-daemon started\\\\n2026-03-19T15:17:38Z [verbose] Readiness Indicator file check\\\\n2026-03-19T15:18:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9989m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.756076 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb12deb7e00cf0682a22506e07b8f308e9ef058b27fd8af20c0032a24be593ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f467c4b9b610ab6ac2a30f0219fd44a9efa364ad6d7f9233ff8eb286aa61147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.769728 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.786508 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7afaaec8-b9d9-4b61-8bd2-3517ef7de1db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d1b78b7b716ebb0981f773174244d9a9583d22d06c13ee70c3057304e406e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4da53bfaa6d0dd080a777a47277b6495aff6068f3dd9b63a34d8b939f85bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5803dc2a94cc3cd9933e818f8e799c4253f15ac23b7d2b3401d16f1c71c8e469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc10ad86596a506058d0a324598e20e6b0909713e9589e85d0bbedb391ef435\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab4
4d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab44d66d42693606a2c8ef888c01b279aef6db28d6c8d4ce25698159ebadaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf9aa029bcbc3e99f1f9c11cf90ecc7a072faf93888ac773fd05169870a24a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7bf79d0b1bb11b6d62585491e13e67d085aac63597198fee907a52c2085358c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:17:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rt68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:17:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nmdkf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.793665 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.793723 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.793741 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.793767 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.793785 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:51Z","lastTransitionTime":"2026-03-19T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.799545 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7c8c95-d715-48ad-9244-c035db503075\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7c0836517f4ced7fafcb12ff073d4ef8885f18aec49a1e28404bb23103c42c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a796f2dbeb7d1bba7c52e85cb054117dbe63faa3482fa5cd9e604d11bc54f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.809190 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.813345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.813385 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.813395 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.813413 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.813425 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:51Z","lastTransitionTime":"2026-03-19T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.813639 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f231f29-5fc5-412c-ae86-574ab06a1fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T15:16:41Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 15:16:41.020744 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 15:16:41.020844 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 15:16:41.021416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1381006819/tls.crt::/tmp/serving-cert-1381006819/tls.key\\\\\\\"\\\\nI0319 15:16:41.538240 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 15:16:41.541144 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 15:16:41.541165 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 15:16:41.541183 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 15:16:41.541189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 15:16:41.547474 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 15:16:41.547500 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547505 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 15:16:41.547509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 15:16:41.547513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 15:16:41.547516 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 15:16:41.547519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 15:16:41.547590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 15:16:41.550098 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T15:16:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.828467 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4658283-2a1d-4cda-8827-354317bcc677\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T15:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77527adccf33798bec152536c72459bb99ca3e53327a4c748d7544d99d5e7e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://809563b276b55898d5e5824345fbaf17170f5c3ca405104aa5973e91cb1293b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe2d7773b455689fcc7033bd53436cd5e99831482ede8da716f5470038b84250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T15:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://71e7e01d3b9ac7273d8c230bb84c403c9fd3468163e994221617fb084bc8b5bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T15:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T15:15:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T15:15:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.828820 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.832338 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.832364 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.832374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.832392 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.832404 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:51Z","lastTransitionTime":"2026-03-19T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.842063 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T15:17:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.845925 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.849704 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.849751 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.849768 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.849792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.849812 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:51Z","lastTransitionTime":"2026-03-19T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.863654 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.867162 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.867193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.867202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.867215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:18:51 crc kubenswrapper[4771]: I0319 15:18:51.867225 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:18:51Z","lastTransitionTime":"2026-03-19T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.880769 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T15:18:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"03b7c304-29fa-4242-a7e5-f84ad5b17d5b\\\",\\\"systemUUID\\\":\\\"c77dd57c-51a3-4dec-a58e-8126c5679a04\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T15:18:51Z is after 2025-08-24T17:21:41Z" Mar 19 15:18:51 crc kubenswrapper[4771]: E0319 15:18:51.880878 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 15:18:52 crc kubenswrapper[4771]: I0319 15:18:52.507858 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:52 crc kubenswrapper[4771]: E0319 15:18:52.508612 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:53 crc kubenswrapper[4771]: I0319 15:18:53.508205 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:53 crc kubenswrapper[4771]: I0319 15:18:53.508268 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:53 crc kubenswrapper[4771]: I0319 15:18:53.508464 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:53 crc kubenswrapper[4771]: E0319 15:18:53.508626 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:53 crc kubenswrapper[4771]: E0319 15:18:53.508758 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:53 crc kubenswrapper[4771]: E0319 15:18:53.508952 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:54 crc kubenswrapper[4771]: I0319 15:18:54.508101 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:54 crc kubenswrapper[4771]: E0319 15:18:54.508595 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:55 crc kubenswrapper[4771]: I0319 15:18:55.508192 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:55 crc kubenswrapper[4771]: I0319 15:18:55.508258 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:55 crc kubenswrapper[4771]: I0319 15:18:55.508189 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:55 crc kubenswrapper[4771]: E0319 15:18:55.508433 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:55 crc kubenswrapper[4771]: E0319 15:18:55.508558 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:55 crc kubenswrapper[4771]: E0319 15:18:55.508727 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:56 crc kubenswrapper[4771]: I0319 15:18:56.507824 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:56 crc kubenswrapper[4771]: E0319 15:18:56.508081 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:56 crc kubenswrapper[4771]: E0319 15:18:56.651512 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:18:57 crc kubenswrapper[4771]: I0319 15:18:57.508581 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:57 crc kubenswrapper[4771]: I0319 15:18:57.508645 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:57 crc kubenswrapper[4771]: E0319 15:18:57.508904 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:57 crc kubenswrapper[4771]: I0319 15:18:57.508961 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:57 crc kubenswrapper[4771]: E0319 15:18:57.509119 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:57 crc kubenswrapper[4771]: E0319 15:18:57.509303 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:18:58 crc kubenswrapper[4771]: I0319 15:18:58.508240 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:18:58 crc kubenswrapper[4771]: E0319 15:18:58.508591 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:18:59 crc kubenswrapper[4771]: I0319 15:18:59.508225 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:18:59 crc kubenswrapper[4771]: E0319 15:18:59.508903 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:18:59 crc kubenswrapper[4771]: I0319 15:18:59.509216 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:18:59 crc kubenswrapper[4771]: I0319 15:18:59.509343 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:18:59 crc kubenswrapper[4771]: E0319 15:18:59.509955 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:18:59 crc kubenswrapper[4771]: E0319 15:18:59.510131 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:19:00 crc kubenswrapper[4771]: I0319 15:19:00.507627 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:00 crc kubenswrapper[4771]: E0319 15:19:00.508218 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.507605 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:19:01 crc kubenswrapper[4771]: E0319 15:19:01.507733 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.507819 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.508006 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:19:01 crc kubenswrapper[4771]: E0319 15:19:01.508247 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:19:01 crc kubenswrapper[4771]: E0319 15:19:01.508676 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.584063 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qhmqm" podStartSLOduration=142.584039971 podStartE2EDuration="2m22.584039971s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.56111518 +0000 UTC m=+200.789736372" watchObservedRunningTime="2026-03-19 15:19:01.584039971 +0000 UTC m=+200.812661173" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.599142 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hg7b2" podStartSLOduration=142.599116004 podStartE2EDuration="2m22.599116004s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.598446616 +0000 UTC m=+200.827067848" watchObservedRunningTime="2026-03-19 15:19:01.599116004 +0000 UTC m=+200.827737236" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.599504 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=57.599491783 podStartE2EDuration="57.599491783s" podCreationTimestamp="2026-03-19 15:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.585064388 +0000 UTC m=+200.813685590" watchObservedRunningTime="2026-03-19 15:19:01.599491783 +0000 UTC m=+200.828113025" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.611792 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podStartSLOduration=142.611772915 podStartE2EDuration="2m22.611772915s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.611691183 +0000 UTC m=+200.840312385" watchObservedRunningTime="2026-03-19 15:19:01.611772915 +0000 UTC m=+200.840394117" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.634867 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9989m" podStartSLOduration=142.634840979 podStartE2EDuration="2m22.634840979s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.634842839 +0000 UTC m=+200.863464071" watchObservedRunningTime="2026-03-19 15:19:01.634840979 +0000 UTC m=+200.863462191" Mar 19 15:19:01 crc kubenswrapper[4771]: E0319 15:19:01.652618 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.688634 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nmdkf" podStartSLOduration=141.688613423 podStartE2EDuration="2m21.688613423s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.688558312 +0000 UTC m=+200.917179534" watchObservedRunningTime="2026-03-19 15:19:01.688613423 +0000 UTC m=+200.917234625" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.721189 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=106.721163898 podStartE2EDuration="1m46.721163898s" podCreationTimestamp="2026-03-19 15:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.721158538 +0000 UTC m=+200.949779750" watchObservedRunningTime="2026-03-19 15:19:01.721163898 +0000 UTC m=+200.949785140" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.721694 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=32.721683611 podStartE2EDuration="32.721683611s" podCreationTimestamp="2026-03-19 15:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.70032331 +0000 UTC m=+200.928944532" watchObservedRunningTime="2026-03-19 15:19:01.721683611 +0000 UTC m=+200.950304843" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.735612 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=70.735589683 
podStartE2EDuration="1m10.735589683s" podCreationTimestamp="2026-03-19 15:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.735396159 +0000 UTC m=+200.964017361" watchObservedRunningTime="2026-03-19 15:19:01.735589683 +0000 UTC m=+200.964210885" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.775837 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rgdpp" podStartSLOduration=141.775819804 podStartE2EDuration="2m21.775819804s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.76305867 +0000 UTC m=+200.991679882" watchObservedRunningTime="2026-03-19 15:19:01.775819804 +0000 UTC m=+201.004441016" Mar 19 15:19:01 crc kubenswrapper[4771]: I0319 15:19:01.814901 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=50.814879084 podStartE2EDuration="50.814879084s" podCreationTimestamp="2026-03-19 15:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:01.800674874 +0000 UTC m=+201.029296076" watchObservedRunningTime="2026-03-19 15:19:01.814879084 +0000 UTC m=+201.043500286" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.012681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.012740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.012757 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.012783 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.012805 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T15:19:02Z","lastTransitionTime":"2026-03-19T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.076859 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4"] Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.077354 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.079570 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.079843 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.080084 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.081889 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.233081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.233128 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.233194 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.233239 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.233263 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: 
\"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.334154 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.334230 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.334275 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.334296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.334387 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.334313 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.334593 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.336903 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.343205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 
15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.351798 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd3803e8-2b3b-47a5-8ff6-f48ee3392784-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s5wg4\" (UID: \"bd3803e8-2b3b-47a5-8ff6-f48ee3392784\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.397866 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" Mar 19 15:19:02 crc kubenswrapper[4771]: W0319 15:19:02.418880 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd3803e8_2b3b_47a5_8ff6_f48ee3392784.slice/crio-40530a3a25dfe8dcd1e9c40788fdd9903b08d71b04c659a500d31e299e50a339 WatchSource:0}: Error finding container 40530a3a25dfe8dcd1e9c40788fdd9903b08d71b04c659a500d31e299e50a339: Status 404 returned error can't find the container with id 40530a3a25dfe8dcd1e9c40788fdd9903b08d71b04c659a500d31e299e50a339 Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.508710 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:02 crc kubenswrapper[4771]: E0319 15:19:02.508908 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.598346 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.607401 4771 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.964811 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" event={"ID":"bd3803e8-2b3b-47a5-8ff6-f48ee3392784","Type":"ContainerStarted","Data":"ccc0bcc6ee7a878b3a67e241f97454ab6be0ab6c615d97d6ce17d6e7d5382f92"} Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.964901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" event={"ID":"bd3803e8-2b3b-47a5-8ff6-f48ee3392784","Type":"ContainerStarted","Data":"40530a3a25dfe8dcd1e9c40788fdd9903b08d71b04c659a500d31e299e50a339"} Mar 19 15:19:02 crc kubenswrapper[4771]: I0319 15:19:02.983463 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s5wg4" podStartSLOduration=143.983434101 podStartE2EDuration="2m23.983434101s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:02.982673962 +0000 UTC m=+202.211295164" watchObservedRunningTime="2026-03-19 15:19:02.983434101 +0000 UTC m=+202.212055333" Mar 19 15:19:03 crc kubenswrapper[4771]: I0319 15:19:03.508593 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:19:03 crc kubenswrapper[4771]: I0319 15:19:03.508694 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:19:03 crc kubenswrapper[4771]: I0319 15:19:03.508762 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:19:03 crc kubenswrapper[4771]: E0319 15:19:03.509508 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:19:03 crc kubenswrapper[4771]: E0319 15:19:03.509857 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:19:03 crc kubenswrapper[4771]: E0319 15:19:03.510042 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:19:04 crc kubenswrapper[4771]: I0319 15:19:04.508377 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:04 crc kubenswrapper[4771]: E0319 15:19:04.508575 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:19:04 crc kubenswrapper[4771]: I0319 15:19:04.509865 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:19:04 crc kubenswrapper[4771]: E0319 15:19:04.510145 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" Mar 19 15:19:05 crc kubenswrapper[4771]: I0319 15:19:05.508596 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:19:05 crc kubenswrapper[4771]: I0319 15:19:05.508614 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:19:05 crc kubenswrapper[4771]: E0319 15:19:05.509317 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:19:05 crc kubenswrapper[4771]: I0319 15:19:05.508681 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:19:05 crc kubenswrapper[4771]: E0319 15:19:05.510200 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:19:05 crc kubenswrapper[4771]: E0319 15:19:05.509319 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:19:06 crc kubenswrapper[4771]: I0319 15:19:06.508704 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:06 crc kubenswrapper[4771]: E0319 15:19:06.509364 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:19:06 crc kubenswrapper[4771]: E0319 15:19:06.654579 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:19:07 crc kubenswrapper[4771]: I0319 15:19:07.508776 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:19:07 crc kubenswrapper[4771]: E0319 15:19:07.508973 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:19:07 crc kubenswrapper[4771]: I0319 15:19:07.509224 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:07 crc kubenswrapper[4771]: E0319 15:19:07.509292 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:07 crc kubenswrapper[4771]: I0319 15:19:07.510259 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:07 crc kubenswrapper[4771]: E0319 15:19:07.510345 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:08 crc kubenswrapper[4771]: I0319 15:19:08.508433 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:08 crc kubenswrapper[4771]: E0319 15:19:08.508635 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:09 crc kubenswrapper[4771]: I0319 15:19:09.508766 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:09 crc kubenswrapper[4771]: I0319 15:19:09.508840 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:09 crc kubenswrapper[4771]: I0319 15:19:09.508813 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:09 crc kubenswrapper[4771]: E0319 15:19:09.509080 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:09 crc kubenswrapper[4771]: E0319 15:19:09.509284 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:09 crc kubenswrapper[4771]: E0319 15:19:09.509412 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:09 crc kubenswrapper[4771]: I0319 15:19:09.996023 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9989m_51f8c2de-454d-4b7c-bf30-2f5d12d7088e/kube-multus/1.log"
Mar 19 15:19:10 crc kubenswrapper[4771]: I0319 15:19:10.000598 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9989m_51f8c2de-454d-4b7c-bf30-2f5d12d7088e/kube-multus/0.log"
Mar 19 15:19:10 crc kubenswrapper[4771]: I0319 15:19:10.000685 4771 generic.go:334] "Generic (PLEG): container finished" podID="51f8c2de-454d-4b7c-bf30-2f5d12d7088e" containerID="65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2" exitCode=1
Mar 19 15:19:10 crc kubenswrapper[4771]: I0319 15:19:10.000742 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9989m" event={"ID":"51f8c2de-454d-4b7c-bf30-2f5d12d7088e","Type":"ContainerDied","Data":"65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2"}
Mar 19 15:19:10 crc kubenswrapper[4771]: I0319 15:19:10.000811 4771 scope.go:117] "RemoveContainer" containerID="b0bec1d147115df21e792c5c425fff977a9cc6328d9fb21b1c0cb9509e8e3ad9"
Mar 19 15:19:10 crc kubenswrapper[4771]: I0319 15:19:10.001905 4771 scope.go:117] "RemoveContainer" containerID="65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2"
Mar 19 15:19:10 crc kubenswrapper[4771]: E0319 15:19:10.002431 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-9989m_openshift-multus(51f8c2de-454d-4b7c-bf30-2f5d12d7088e)\"" pod="openshift-multus/multus-9989m" podUID="51f8c2de-454d-4b7c-bf30-2f5d12d7088e"
Mar 19 15:19:10 crc kubenswrapper[4771]: I0319 15:19:10.508487 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:10 crc kubenswrapper[4771]: E0319 15:19:10.508736 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:11 crc kubenswrapper[4771]: I0319 15:19:11.007311 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9989m_51f8c2de-454d-4b7c-bf30-2f5d12d7088e/kube-multus/1.log"
Mar 19 15:19:11 crc kubenswrapper[4771]: I0319 15:19:11.508173 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:11 crc kubenswrapper[4771]: I0319 15:19:11.508199 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:11 crc kubenswrapper[4771]: I0319 15:19:11.508515 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:11 crc kubenswrapper[4771]: E0319 15:19:11.510406 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:11 crc kubenswrapper[4771]: E0319 15:19:11.511032 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:11 crc kubenswrapper[4771]: E0319 15:19:11.511168 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:11 crc kubenswrapper[4771]: E0319 15:19:11.655559 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 15:19:12 crc kubenswrapper[4771]: I0319 15:19:12.507808 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:12 crc kubenswrapper[4771]: E0319 15:19:12.508060 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:13 crc kubenswrapper[4771]: I0319 15:19:13.508596 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:13 crc kubenswrapper[4771]: I0319 15:19:13.508693 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:13 crc kubenswrapper[4771]: E0319 15:19:13.508732 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:13 crc kubenswrapper[4771]: I0319 15:19:13.508541 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:13 crc kubenswrapper[4771]: E0319 15:19:13.508919 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:13 crc kubenswrapper[4771]: E0319 15:19:13.508943 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:14 crc kubenswrapper[4771]: I0319 15:19:14.507597 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:14 crc kubenswrapper[4771]: E0319 15:19:14.507761 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:15 crc kubenswrapper[4771]: I0319 15:19:15.508718 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:15 crc kubenswrapper[4771]: I0319 15:19:15.508846 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:15 crc kubenswrapper[4771]: E0319 15:19:15.508881 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:15 crc kubenswrapper[4771]: I0319 15:19:15.508719 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:15 crc kubenswrapper[4771]: E0319 15:19:15.509394 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:15 crc kubenswrapper[4771]: E0319 15:19:15.509527 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:15 crc kubenswrapper[4771]: I0319 15:19:15.509982 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"
Mar 19 15:19:15 crc kubenswrapper[4771]: E0319 15:19:15.510422 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-b6zx4_openshift-ovn-kubernetes(bf31981b-d437-4216-a275-5b566d8c49aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa"
Mar 19 15:19:16 crc kubenswrapper[4771]: I0319 15:19:16.508043 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:16 crc kubenswrapper[4771]: E0319 15:19:16.508249 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:16 crc kubenswrapper[4771]: E0319 15:19:16.656973 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 15:19:17 crc kubenswrapper[4771]: I0319 15:19:17.508475 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:17 crc kubenswrapper[4771]: I0319 15:19:17.508497 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:17 crc kubenswrapper[4771]: E0319 15:19:17.508770 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:17 crc kubenswrapper[4771]: I0319 15:19:17.508499 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:17 crc kubenswrapper[4771]: E0319 15:19:17.509014 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:17 crc kubenswrapper[4771]: E0319 15:19:17.509093 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:18 crc kubenswrapper[4771]: I0319 15:19:18.508369 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:18 crc kubenswrapper[4771]: E0319 15:19:18.508634 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:19 crc kubenswrapper[4771]: I0319 15:19:19.508064 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:19 crc kubenswrapper[4771]: I0319 15:19:19.508150 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:19 crc kubenswrapper[4771]: I0319 15:19:19.508064 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:19 crc kubenswrapper[4771]: E0319 15:19:19.508236 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:19 crc kubenswrapper[4771]: E0319 15:19:19.508317 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:19 crc kubenswrapper[4771]: E0319 15:19:19.508469 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:20 crc kubenswrapper[4771]: I0319 15:19:20.508508 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:20 crc kubenswrapper[4771]: E0319 15:19:20.508657 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:21 crc kubenswrapper[4771]: I0319 15:19:21.508866 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:21 crc kubenswrapper[4771]: I0319 15:19:21.508912 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:21 crc kubenswrapper[4771]: E0319 15:19:21.511229 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:21 crc kubenswrapper[4771]: I0319 15:19:21.511310 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:21 crc kubenswrapper[4771]: E0319 15:19:21.511407 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:21 crc kubenswrapper[4771]: E0319 15:19:21.511561 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:21 crc kubenswrapper[4771]: E0319 15:19:21.657649 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 15:19:22 crc kubenswrapper[4771]: I0319 15:19:22.508578 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:22 crc kubenswrapper[4771]: I0319 15:19:22.508760 4771 scope.go:117] "RemoveContainer" containerID="65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2"
Mar 19 15:19:22 crc kubenswrapper[4771]: E0319 15:19:22.508790 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.056639 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9989m_51f8c2de-454d-4b7c-bf30-2f5d12d7088e/kube-multus/1.log"
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.056720 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9989m" event={"ID":"51f8c2de-454d-4b7c-bf30-2f5d12d7088e","Type":"ContainerStarted","Data":"3bd5d8865766ecf282dd1c1331a00385cc5afafe9aeec835b90a86519234b1ab"}
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.385401 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.385545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.385687 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.385695 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:21:25.385655751 +0000 UTC m=+344.614276983 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.385763 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:21:25.385740173 +0000 UTC m=+344.614361415 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.385799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.386115 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.386226 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:21:25.386206344 +0000 UTC m=+344.614827576 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.487445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.487515 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.487723 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.487752 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.487774 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.487793 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.487794 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.487820 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.487863 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:21:25.487843912 +0000 UTC m=+344.716465174 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.487897 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:21:25.487872973 +0000 UTC m=+344.716494205 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.508510 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.508590 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:23 crc kubenswrapper[4771]: I0319 15:19:23.508662 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.508687 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.508779 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:23 crc kubenswrapper[4771]: E0319 15:19:23.508895 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:24 crc kubenswrapper[4771]: I0319 15:19:24.507909 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:24 crc kubenswrapper[4771]: E0319 15:19:24.508087 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:25 crc kubenswrapper[4771]: I0319 15:19:25.508739 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:25 crc kubenswrapper[4771]: I0319 15:19:25.508783 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:25 crc kubenswrapper[4771]: E0319 15:19:25.509025 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:25 crc kubenswrapper[4771]: I0319 15:19:25.509090 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:25 crc kubenswrapper[4771]: E0319 15:19:25.509435 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:25 crc kubenswrapper[4771]: E0319 15:19:25.509636 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:26 crc kubenswrapper[4771]: I0319 15:19:26.508243 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:19:26 crc kubenswrapper[4771]: E0319 15:19:26.508446 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 15:19:26 crc kubenswrapper[4771]: E0319 15:19:26.658644 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 15:19:27 crc kubenswrapper[4771]: I0319 15:19:27.508451 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:19:27 crc kubenswrapper[4771]: I0319 15:19:27.508561 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:27 crc kubenswrapper[4771]: E0319 15:19:27.508638 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 15:19:27 crc kubenswrapper[4771]: I0319 15:19:27.508668 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:19:27 crc kubenswrapper[4771]: E0319 15:19:27.508828 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90"
Mar 19 15:19:27 crc kubenswrapper[4771]: E0319 15:19:27.508944 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 15:19:28 crc kubenswrapper[4771]: I0319 15:19:28.508255 4771 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:28 crc kubenswrapper[4771]: E0319 15:19:28.508471 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:19:29 crc kubenswrapper[4771]: I0319 15:19:29.508124 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:19:29 crc kubenswrapper[4771]: I0319 15:19:29.508193 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:19:29 crc kubenswrapper[4771]: I0319 15:19:29.508276 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:19:29 crc kubenswrapper[4771]: E0319 15:19:29.508343 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:19:29 crc kubenswrapper[4771]: E0319 15:19:29.508516 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:19:29 crc kubenswrapper[4771]: E0319 15:19:29.508680 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:19:30 crc kubenswrapper[4771]: I0319 15:19:30.508378 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:30 crc kubenswrapper[4771]: E0319 15:19:30.508655 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:19:30 crc kubenswrapper[4771]: I0319 15:19:30.509655 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:19:31 crc kubenswrapper[4771]: I0319 15:19:31.084105 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/3.log" Mar 19 15:19:31 crc kubenswrapper[4771]: I0319 15:19:31.087532 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerStarted","Data":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} Mar 19 15:19:31 crc kubenswrapper[4771]: I0319 15:19:31.088209 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:19:31 crc kubenswrapper[4771]: I0319 15:19:31.121227 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podStartSLOduration=171.121202278 podStartE2EDuration="2m51.121202278s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:31.11932291 +0000 UTC m=+230.347944152" watchObservedRunningTime="2026-03-19 15:19:31.121202278 +0000 UTC m=+230.349823480" Mar 19 15:19:31 crc kubenswrapper[4771]: I0319 15:19:31.296526 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zjhnk"] Mar 19 15:19:31 crc kubenswrapper[4771]: I0319 15:19:31.296711 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:19:31 crc kubenswrapper[4771]: E0319 15:19:31.296910 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:19:31 crc kubenswrapper[4771]: I0319 15:19:31.508454 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:19:31 crc kubenswrapper[4771]: I0319 15:19:31.508898 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:19:31 crc kubenswrapper[4771]: E0319 15:19:31.509784 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:19:31 crc kubenswrapper[4771]: E0319 15:19:31.510047 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:19:31 crc kubenswrapper[4771]: E0319 15:19:31.659722 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 15:19:32 crc kubenswrapper[4771]: I0319 15:19:32.507964 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:32 crc kubenswrapper[4771]: E0319 15:19:32.508155 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:19:33 crc kubenswrapper[4771]: I0319 15:19:33.507800 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:19:33 crc kubenswrapper[4771]: I0319 15:19:33.507893 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:19:33 crc kubenswrapper[4771]: I0319 15:19:33.507905 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:19:33 crc kubenswrapper[4771]: E0319 15:19:33.509433 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:19:33 crc kubenswrapper[4771]: E0319 15:19:33.509533 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:19:33 crc kubenswrapper[4771]: E0319 15:19:33.509616 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:19:34 crc kubenswrapper[4771]: I0319 15:19:34.508553 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:34 crc kubenswrapper[4771]: E0319 15:19:34.509103 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:19:35 crc kubenswrapper[4771]: I0319 15:19:35.508626 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:19:35 crc kubenswrapper[4771]: I0319 15:19:35.508733 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:19:35 crc kubenswrapper[4771]: I0319 15:19:35.508733 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:19:35 crc kubenswrapper[4771]: E0319 15:19:35.508915 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:19:35 crc kubenswrapper[4771]: E0319 15:19:35.509141 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:19:35 crc kubenswrapper[4771]: E0319 15:19:35.509238 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:19:36 crc kubenswrapper[4771]: I0319 15:19:36.508523 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:36 crc kubenswrapper[4771]: E0319 15:19:36.508749 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:19:37 crc kubenswrapper[4771]: I0319 15:19:37.507753 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:19:37 crc kubenswrapper[4771]: I0319 15:19:37.507893 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:19:37 crc kubenswrapper[4771]: I0319 15:19:37.507958 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:19:37 crc kubenswrapper[4771]: I0319 15:19:37.510478 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 15:19:37 crc kubenswrapper[4771]: I0319 15:19:37.510961 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 15:19:37 crc kubenswrapper[4771]: I0319 15:19:37.512647 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 15:19:37 crc kubenswrapper[4771]: I0319 15:19:37.512829 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 15:19:38 crc kubenswrapper[4771]: I0319 15:19:38.508128 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:19:38 crc kubenswrapper[4771]: I0319 15:19:38.510558 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 15:19:38 crc kubenswrapper[4771]: I0319 15:19:38.510698 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.860790 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.902650 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.903317 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.903512 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.904447 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.905216 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2mx6f"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.906278 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rn8sc"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.912221 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.913329 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.913391 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.914389 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.914692 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.914806 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.915031 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.915408 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.919385 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.920651 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sl682"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.920780 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.921780 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.922302 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpwl9"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.923918 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.924406 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.924913 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.925063 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.927858 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-h97xq"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.931623 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.942219 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.948225 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.949016 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wwpj8"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.949700 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wwpj8" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.950332 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.950421 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.950442 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.950596 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.950776 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.950802 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.950973 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951025 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951175 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951190 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951175 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951333 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951392 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951599 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951685 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951742 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951751 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951823 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951854 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.951899 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.953091 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.953441 4771 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.959636 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.959766 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.968093 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.970192 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8827r"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.970947 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.971953 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-m25b8"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.972293 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.972407 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-m25b8" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.973207 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.973640 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.973821 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.973942 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.974095 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.974616 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.974773 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.975072 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.975170 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.975237 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" 
Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.975414 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.975486 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.975531 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.975757 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.975803 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.975896 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.976063 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.976268 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.976302 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.976455 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 
15:19:42.976568 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.976603 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.977022 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.977145 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.977249 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.980420 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.981274 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.981530 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.984161 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.984380 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.985196 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.991068 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.991150 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.991782 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fvhqg"] Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.991829 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.992011 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.992067 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.992105 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.992190 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.992286 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg" Mar 19 15:19:42 crc kubenswrapper[4771]: I0319 15:19:42.992358 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.009442 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.010232 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.013950 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.014453 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.015250 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.015854 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.016428 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.016525 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.016932 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.017335 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr"] Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.017351 
4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.017581 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.017745 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.017776 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.017806 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.017940 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.018851 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.019338 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.019721 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.033748 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.061304 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.061498 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.062704 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.062902 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.063141 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qj9qq"] Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.063847 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065033 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-machine-approver-tls\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43df1e2-591a-43e2-a7e2-f48459125711-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gvngw\" (UID: \"d43df1e2-591a-43e2-a7e2-f48459125711\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065082 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b11452b4-794b-41a4-a700-0b541916c6ad-encryption-config\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065102 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fbe793-06cf-41b4-b24b-a57657dc05f2-config\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065118 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c75a371-7547-47b8-ac59-d296d642cd5c-etcd-client\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065135 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp5v6\" (UniqueName: \"kubernetes.io/projected/91fbe793-06cf-41b4-b24b-a57657dc05f2-kube-api-access-xp5v6\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065152 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpx66\" (UniqueName: \"kubernetes.io/projected/8169e27a-4573-4d03-b3e8-0c072a2efbe7-kube-api-access-dpx66\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065168 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8169e27a-4573-4d03-b3e8-0c072a2efbe7-config\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065185 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b11452b4-794b-41a4-a700-0b541916c6ad-serving-cert\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: 
\"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065202 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-config\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065219 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5ae0ccc-a50b-46d1-b887-28840703ab87-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065236 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-client-ca\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065271 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f5ae0ccc-a50b-46d1-b887-28840703ab87-images\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065478 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8169e27a-4573-4d03-b3e8-0c072a2efbe7-trusted-ca\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065495 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91fbe793-06cf-41b4-b24b-a57657dc05f2-serving-cert\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065509 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d43df1e2-591a-43e2-a7e2-f48459125711-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gvngw\" (UID: \"d43df1e2-591a-43e2-a7e2-f48459125711\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcgxg\" (UniqueName: \"kubernetes.io/projected/d43df1e2-591a-43e2-a7e2-f48459125711-kube-api-access-xcgxg\") pod \"openshift-apiserver-operator-796bbdcf4f-gvngw\" (UID: \"d43df1e2-591a-43e2-a7e2-f48459125711\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065545 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-oauth-serving-cert\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " 
pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065578 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c75a371-7547-47b8-ac59-d296d642cd5c-serving-cert\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065594 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t76v\" (UniqueName: \"kubernetes.io/projected/77444ace-be14-4606-898d-565c52bec7b0-kube-api-access-8t76v\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwdj8\" (UID: \"77444ace-be14-4606-898d-565c52bec7b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065618 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-config\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065635 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b01a56e9-ee30-4945-b582-5ff927104c4c-serving-cert\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/77444ace-be14-4606-898d-565c52bec7b0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwdj8\" (UID: \"77444ace-be14-4606-898d-565c52bec7b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065667 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-config\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065684 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-etcd-serving-ca\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065717 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b11452b4-794b-41a4-a700-0b541916c6ad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065733 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065748 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfkm6\" (UniqueName: \"kubernetes.io/projected/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-kube-api-access-hfkm6\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065763 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c75a371-7547-47b8-ac59-d296d642cd5c-node-pullsecrets\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065778 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ae0ccc-a50b-46d1-b887-28840703ab87-config\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065797 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-client-ca\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065812 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/b11452b4-794b-41a4-a700-0b541916c6ad-etcd-client\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065837 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2426c\" (UniqueName: \"kubernetes.io/projected/1c75a371-7547-47b8-ac59-d296d642cd5c-kube-api-access-2426c\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065855 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-service-ca\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065874 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b11452b4-794b-41a4-a700-0b541916c6ad-audit-policies\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065892 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-trusted-ca-bundle\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065907 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c75a371-7547-47b8-ac59-d296d642cd5c-audit-dir\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065924 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-config\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.065940 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-oauth-config\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066009 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-auth-proxy-config\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066027 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1c75a371-7547-47b8-ac59-d296d642cd5c-encryption-config\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " 
pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066041 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77444ace-be14-4606-898d-565c52bec7b0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwdj8\" (UID: \"77444ace-be14-4606-898d-565c52bec7b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066058 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8169e27a-4573-4d03-b3e8-0c072a2efbe7-serving-cert\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066073 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066089 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vpn5\" (UniqueName: \"kubernetes.io/projected/f5ae0ccc-a50b-46d1-b887-28840703ab87-kube-api-access-7vpn5\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-serving-cert\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066120 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzzgx\" (UniqueName: \"kubernetes.io/projected/b01a56e9-ee30-4945-b582-5ff927104c4c-kube-api-access-lzzgx\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066134 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-image-import-ca\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-config\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066175 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b11452b4-794b-41a4-a700-0b541916c6ad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066192 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91fbe793-06cf-41b4-b24b-a57657dc05f2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066221 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91fbe793-06cf-41b4-b24b-a57657dc05f2-service-ca-bundle\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066238 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j564s\" (UniqueName: \"kubernetes.io/projected/b11452b4-794b-41a4-a700-0b541916c6ad-kube-api-access-j564s\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066268 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4xcw\" (UniqueName: \"kubernetes.io/projected/6dc754c0-8f17-402b-9bd4-be033eb940ba-kube-api-access-c4xcw\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066283 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-audit\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066299 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-454bf\" (UniqueName: \"kubernetes.io/projected/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-kube-api-access-454bf\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066318 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-serving-cert\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066354 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b11452b4-794b-41a4-a700-0b541916c6ad-audit-dir\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066546 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066640 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.066809 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.067738 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.069191 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.069604 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.069828 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.071574 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.071778 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.072453 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wm59s"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.073451 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.074026 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.074387 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.074481 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.074637 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.074728 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.077412 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.079431 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.079786 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.083511 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.083976 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.086898 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.094063 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hcvsv"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.095027 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.095616 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5jvq8"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.095743 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.096154 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.096213 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5jvq8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.099047 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.099763 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7fqvf"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.100105 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k96m2"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.100299 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.100530 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vm7fz"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.100378 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.100886 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.100997 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.106554 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.119101 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.119810 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.121656 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.122402 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.122820 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.123782 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.124648 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.125157 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.125833 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.126223 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.126734 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.127114 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.127756 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.127886 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.128497 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.129040 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.129646 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.129906 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.130938 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.131246 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565558-wvlb8"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.131963 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565558-wvlb8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.133243 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2mx6f"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.133963 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.139566 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.145810 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.146242 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.147141 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.151701 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpwl9"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.157751 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sl682"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.158241 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-wbmz4"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.159397 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wbmz4"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.161142 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.162462 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fvhqg"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.165090 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-m25b8"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.167916 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8169e27a-4573-4d03-b3e8-0c072a2efbe7-serving-cert\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.167959 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vrpz\" (UniqueName: \"kubernetes.io/projected/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-kube-api-access-7vrpz\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.167995 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a212fe68-9219-479d-bf36-26b08daf31ab-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vdfcr\" (UID: \"a212fe68-9219-479d-bf36-26b08daf31ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168018 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vpn5\" (UniqueName: \"kubernetes.io/projected/f5ae0ccc-a50b-46d1-b887-28840703ab87-kube-api-access-7vpn5\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168061 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5847be39-eb3d-4802-8f96-771f91078979-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168080 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-trusted-ca\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168102 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-serving-cert\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168126 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168189 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a212fe68-9219-479d-bf36-26b08daf31ab-config\") pod \"kube-apiserver-operator-766d6c64bb-vdfcr\" (UID: \"a212fe68-9219-479d-bf36-26b08daf31ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168213 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzzgx\" (UniqueName: \"kubernetes.io/projected/b01a56e9-ee30-4945-b582-5ff927104c4c-kube-api-access-lzzgx\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168235 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdzmn\" (UniqueName: \"kubernetes.io/projected/6f5dca39-298b-4814-b77c-43dd0cbc4025-kube-api-access-hdzmn\") pod \"downloads-7954f5f757-m25b8\" (UID: \"6f5dca39-298b-4814-b77c-43dd0cbc4025\") " pod="openshift-console/downloads-7954f5f757-m25b8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168254 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7k9\" (UniqueName: \"kubernetes.io/projected/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-kube-api-access-qf7k9\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168284 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-image-import-ca\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168304 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-config\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168321 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b11452b4-794b-41a4-a700-0b541916c6ad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91fbe793-06cf-41b4-b24b-a57657dc05f2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168397 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91fbe793-06cf-41b4-b24b-a57657dc05f2-service-ca-bundle\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168418 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j564s\" (UniqueName: \"kubernetes.io/projected/b11452b4-794b-41a4-a700-0b541916c6ad-kube-api-access-j564s\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4xcw\" (UniqueName: \"kubernetes.io/projected/6dc754c0-8f17-402b-9bd4-be033eb940ba-kube-api-access-c4xcw\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168463 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168483 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-audit\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168504 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-454bf\" (UniqueName: \"kubernetes.io/projected/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-kube-api-access-454bf\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168524 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-serving-cert\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168542 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b11452b4-794b-41a4-a700-0b541916c6ad-audit-dir\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168562 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.168746 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-machine-approver-tls\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.170472 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b11452b4-794b-41a4-a700-0b541916c6ad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.170234 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43df1e2-591a-43e2-a7e2-f48459125711-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gvngw\" (UID: \"d43df1e2-591a-43e2-a7e2-f48459125711\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.170668 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b11452b4-794b-41a4-a700-0b541916c6ad-encryption-config\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.171631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-config\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-audit-policies\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172205 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2db36f46-e19a-4b7d-a94f-157f65671639-audit-dir\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172231 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fbe793-06cf-41b4-b24b-a57657dc05f2-config\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172254 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg6bm\" (UniqueName: \"kubernetes.io/projected/5847be39-eb3d-4802-8f96-771f91078979-kube-api-access-vg6bm\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172298 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c75a371-7547-47b8-ac59-d296d642cd5c-etcd-client\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172419 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp5v6\" (UniqueName: \"kubernetes.io/projected/91fbe793-06cf-41b4-b24b-a57657dc05f2-kube-api-access-xp5v6\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172492 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172563 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpx66\" (UniqueName: \"kubernetes.io/projected/8169e27a-4573-4d03-b3e8-0c072a2efbe7-kube-api-access-dpx66\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172622 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91fbe793-06cf-41b4-b24b-a57657dc05f2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172633 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172691 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppbbp\" (UniqueName: \"kubernetes.io/projected/2db36f46-e19a-4b7d-a94f-157f65671639-kube-api-access-ppbbp\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172763 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8169e27a-4573-4d03-b3e8-0c072a2efbe7-config\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.172844 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b11452b4-794b-41a4-a700-0b541916c6ad-serving-cert\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.174040 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-audit\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.174218 4771 kubelet.go:2428] "SyncLoop
UPDATE" source="api" pods=["openshift-console/console-f9d7485db-h97xq"] Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.174134 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ncgg\" (UniqueName: \"kubernetes.io/projected/db1585ab-0970-4bd5-acba-7eda8ed2d40f-kube-api-access-5ncgg\") pod \"cluster-samples-operator-665b6dd947-fs46x\" (UID: \"db1585ab-0970-4bd5-acba-7eda8ed2d40f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.174595 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-config\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.175068 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-image-import-ca\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.174817 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5ae0ccc-a50b-46d1-b887-28840703ab87-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.175510 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.171733 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91fbe793-06cf-41b4-b24b-a57657dc05f2-service-ca-bundle\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.176313 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-client-ca\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.179556 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b11452b4-794b-41a4-a700-0b541916c6ad-audit-dir\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.179632 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-etcd-client\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.179770 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f5ae0ccc-a50b-46d1-b887-28840703ab87-images\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.180029 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8169e27a-4573-4d03-b3e8-0c072a2efbe7-trusted-ca\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.180724 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f5ae0ccc-a50b-46d1-b887-28840703ab87-images\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.180731 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-client-ca\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.180803 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91fbe793-06cf-41b4-b24b-a57657dc05f2-serving-cert\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" Mar 19 15:19:43 crc 
kubenswrapper[4771]: I0319 15:19:43.180849 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d43df1e2-591a-43e2-a7e2-f48459125711-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gvngw\" (UID: \"d43df1e2-591a-43e2-a7e2-f48459125711\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.180881 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcgxg\" (UniqueName: \"kubernetes.io/projected/d43df1e2-591a-43e2-a7e2-f48459125711-kube-api-access-xcgxg\") pod \"openshift-apiserver-operator-796bbdcf4f-gvngw\" (UID: \"d43df1e2-591a-43e2-a7e2-f48459125711\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.181013 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.181051 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t76v\" (UniqueName: \"kubernetes.io/projected/77444ace-be14-4606-898d-565c52bec7b0-kube-api-access-8t76v\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwdj8\" (UID: \"77444ace-be14-4606-898d-565c52bec7b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.181080 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db1585ab-0970-4bd5-acba-7eda8ed2d40f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fs46x\" (UID: \"db1585ab-0970-4bd5-acba-7eda8ed2d40f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.181883 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8169e27a-4573-4d03-b3e8-0c072a2efbe7-trusted-ca\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.182893 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.183088 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d43df1e2-591a-43e2-a7e2-f48459125711-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gvngw\" (UID: \"d43df1e2-591a-43e2-a7e2-f48459125711\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.183179 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-oauth-serving-cert\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.183339 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c75a371-7547-47b8-ac59-d296d642cd5c-serving-cert\") pod 
\"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.183388 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.183450 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.183475 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-config\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.184247 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-config\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.184338 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b01a56e9-ee30-4945-b582-5ff927104c4c-serving-cert\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.184375 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77444ace-be14-4606-898d-565c52bec7b0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwdj8\" (UID: \"77444ace-be14-4606-898d-565c52bec7b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.184559 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5847be39-eb3d-4802-8f96-771f91078979-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.184599 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-etcd-ca\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.184860 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-oauth-serving-cert\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " 
pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.185383 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fbe793-06cf-41b4-b24b-a57657dc05f2-config\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.185703 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wwpj8"] Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.186049 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-machine-approver-tls\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.186119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-config\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.186227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91fbe793-06cf-41b4-b24b-a57657dc05f2-serving-cert\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.186279 4771 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-config\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.186498 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c75a371-7547-47b8-ac59-d296d642cd5c-etcd-client\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.186782 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b11452b4-794b-41a4-a700-0b541916c6ad-serving-cert\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.187101 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-serving-cert\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.187267 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8169e27a-4573-4d03-b3e8-0c072a2efbe7-serving-cert\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.187379 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-serving-cert\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.187444 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77444ace-be14-4606-898d-565c52bec7b0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwdj8\" (UID: \"77444ace-be14-4606-898d-565c52bec7b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.187563 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-etcd-serving-ca\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.187595 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b11452b4-794b-41a4-a700-0b541916c6ad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.187664 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.188075 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b11452b4-794b-41a4-a700-0b541916c6ad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.188120 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.188141 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkm6\" (UniqueName: \"kubernetes.io/projected/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-kube-api-access-hfkm6\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.188185 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c75a371-7547-47b8-ac59-d296d642cd5c-node-pullsecrets\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.188206 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ae0ccc-a50b-46d1-b887-28840703ab87-config\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.188219 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8169e27a-4573-4d03-b3e8-0c072a2efbe7-config\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.188225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-client-ca\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.188821 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-client-ca\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.188865 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c75a371-7547-47b8-ac59-d296d642cd5c-node-pullsecrets\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.189049 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b11452b4-794b-41a4-a700-0b541916c6ad-etcd-client\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.189207 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-config\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.189423 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-config\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.189448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ae0ccc-a50b-46d1-b887-28840703ab87-config\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.189783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.190381 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a212fe68-9219-479d-bf36-26b08daf31ab-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vdfcr\" (UID: \"a212fe68-9219-479d-bf36-26b08daf31ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.190495 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cnpf\" (UniqueName: \"kubernetes.io/projected/8796bcce-0185-4494-8485-66476bdde45c-kube-api-access-9cnpf\") pod \"dns-operator-744455d44c-fvhqg\" (UID: \"8796bcce-0185-4494-8485-66476bdde45c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.190601 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2426c\" (UniqueName: \"kubernetes.io/projected/1c75a371-7547-47b8-ac59-d296d642cd5c-kube-api-access-2426c\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.190703 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-serving-cert\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.190436 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 
15:19:43.190810 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1c75a371-7547-47b8-ac59-d296d642cd5c-etcd-serving-ca\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.190437 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b01a56e9-ee30-4945-b582-5ff927104c4c-serving-cert\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.190900 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43df1e2-591a-43e2-a7e2-f48459125711-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gvngw\" (UID: \"d43df1e2-591a-43e2-a7e2-f48459125711\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.190975 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-etcd-service-ca\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191122 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-service-ca\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191218 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5847be39-eb3d-4802-8f96-771f91078979-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191302 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-config\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191384 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b11452b4-794b-41a4-a700-0b541916c6ad-audit-policies\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191466 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-trusted-ca-bundle\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191541 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c75a371-7547-47b8-ac59-d296d642cd5c-audit-dir\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191618 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-metrics-tls\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191689 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-oauth-config\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191762 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-auth-proxy-config\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191852 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191934 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.192041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1c75a371-7547-47b8-ac59-d296d642cd5c-encryption-config\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.192128 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77444ace-be14-4606-898d-565c52bec7b0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwdj8\" (UID: \"77444ace-be14-4606-898d-565c52bec7b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.192232 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8796bcce-0185-4494-8485-66476bdde45c-metrics-tls\") pod \"dns-operator-744455d44c-fvhqg\" (UID: \"8796bcce-0185-4494-8485-66476bdde45c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191779 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-config\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.192536 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-auth-proxy-config\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.192567 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c75a371-7547-47b8-ac59-d296d642cd5c-serving-cert\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.192626 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c75a371-7547-47b8-ac59-d296d642cd5c-audit-dir\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.191394 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wm59s"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.192688 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7fqvf"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.193240 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b11452b4-794b-41a4-a700-0b541916c6ad-audit-policies\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.193548 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77444ace-be14-4606-898d-565c52bec7b0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwdj8\" (UID: \"77444ace-be14-4606-898d-565c52bec7b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.193569 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-trusted-ca-bundle\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.193847 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b11452b4-794b-41a4-a700-0b541916c6ad-encryption-config\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.194188 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b11452b4-794b-41a4-a700-0b541916c6ad-etcd-client\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.194575 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rn8sc"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.194586 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-service-ca\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.195240 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.195575 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-oauth-config\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.197576 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.198674 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8827r"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.200259 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.201287 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.202334 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vm7fz"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.203550 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5ae0ccc-a50b-46d1-b887-28840703ab87-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.203599 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.204917 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.206352 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.209127 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.210707 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1c75a371-7547-47b8-ac59-d296d642cd5c-encryption-config\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.211064 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.212701 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.213867 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hcvsv"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.214929 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.216239 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.217698 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k96m2"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.220953 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.222293 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.223351 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.224780 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ngxwv"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.225970 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ngxwv"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.226166 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.226652 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qj9qq"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.228185 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.229183 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.230192 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565558-wvlb8"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.231180 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.232162 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.233252 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.234441 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ngxwv"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.235407 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bwgwn"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.236615 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hwlpq"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.236941 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bwgwn"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.238076 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwlpq"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.238192 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwlpq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.238916 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bwgwn"]
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.246070 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.265604 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.286375 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293160 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-stats-auth\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293229 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db1585ab-0970-4bd5-acba-7eda8ed2d40f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fs46x\" (UID: \"db1585ab-0970-4bd5-acba-7eda8ed2d40f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293298 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293315 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293359 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5847be39-eb3d-4802-8f96-771f91078979-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293391 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-etcd-ca\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293413 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293449 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9srr\" (UniqueName: \"kubernetes.io/projected/b56fbd59-432d-4672-bc27-0cb80aa81405-kube-api-access-l9srr\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293512 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293536 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a212fe68-9219-479d-bf36-26b08daf31ab-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vdfcr\" (UID: \"a212fe68-9219-479d-bf36-26b08daf31ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293581 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjfnw\" (UniqueName: \"kubernetes.io/projected/31c7ed96-b981-4f85-9f5a-dee62216ecd9-kube-api-access-fjfnw\") pod \"service-ca-9c57cc56f-vm7fz\" (UID: \"31c7ed96-b981-4f85-9f5a-dee62216ecd9\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293601 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cnpf\" (UniqueName: \"kubernetes.io/projected/8796bcce-0185-4494-8485-66476bdde45c-kube-api-access-9cnpf\") pod \"dns-operator-744455d44c-fvhqg\" (UID: \"8796bcce-0185-4494-8485-66476bdde45c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293631 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-serving-cert\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293648 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-metrics-certs\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293672 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-etcd-service-ca\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293690 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5847be39-eb3d-4802-8f96-771f91078979-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293731 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-metrics-tls\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293751 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gprh\" (UniqueName: \"kubernetes.io/projected/b4eb7061-dde4-44f1-943a-219d2f4f5071-kube-api-access-7gprh\") pod \"marketplace-operator-79b997595-7fqvf\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293771 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293807 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8796bcce-0185-4494-8485-66476bdde45c-metrics-tls\") pod \"dns-operator-744455d44c-fvhqg\" (UID: \"8796bcce-0185-4494-8485-66476bdde45c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293823 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-srv-cert\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293851 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vrpz\" (UniqueName: \"kubernetes.io/projected/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-kube-api-access-7vrpz\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293872 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a212fe68-9219-479d-bf36-26b08daf31ab-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vdfcr\" (UID: \"a212fe68-9219-479d-bf36-26b08daf31ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293894 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5847be39-eb3d-4802-8f96-771f91078979-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293912 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7fqvf\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293929 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-trusted-ca\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293949 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.293965 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a212fe68-9219-479d-bf36-26b08daf31ab-config\") pod \"kube-apiserver-operator-766d6c64bb-vdfcr\" (UID: \"a212fe68-9219-479d-bf36-26b08daf31ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdzmn\" (UniqueName: \"kubernetes.io/projected/6f5dca39-298b-4814-b77c-43dd0cbc4025-kube-api-access-hdzmn\") pod \"downloads-7954f5f757-m25b8\" (UID: \"6f5dca39-298b-4814-b77c-43dd0cbc4025\") " pod="openshift-console/downloads-7954f5f757-m25b8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294020 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7k9\" (UniqueName: \"kubernetes.io/projected/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-kube-api-access-qf7k9\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294088 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7fqvf\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294121 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-service-ca-bundle\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294144 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294160 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31c7ed96-b981-4f85-9f5a-dee62216ecd9-signing-cabundle\") pod \"service-ca-9c57cc56f-vm7fz\" (UID: \"31c7ed96-b981-4f85-9f5a-dee62216ecd9\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294176 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b56fbd59-432d-4672-bc27-0cb80aa81405-images\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294199 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-config\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294215 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-audit-policies\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294234 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2db36f46-e19a-4b7d-a94f-157f65671639-audit-dir\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294251 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg6bm\" (UniqueName: \"kubernetes.io/projected/5847be39-eb3d-4802-8f96-771f91078979-kube-api-access-vg6bm\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") "
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294267 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31c7ed96-b981-4f85-9f5a-dee62216ecd9-signing-key\") pod \"service-ca-9c57cc56f-vm7fz\" (UID: \"31c7ed96-b981-4f85-9f5a-dee62216ecd9\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294288 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294305 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b56fbd59-432d-4672-bc27-0cb80aa81405-proxy-tls\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294320 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b56fbd59-432d-4672-bc27-0cb80aa81405-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294339 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-default-certificate\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294366 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294385 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppbbp\" (UniqueName: \"kubernetes.io/projected/2db36f46-e19a-4b7d-a94f-157f65671639-kube-api-access-ppbbp\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294400 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph76z\" (UniqueName: \"kubernetes.io/projected/f790bc61-0a88-45de-b672-76b356fb8522-kube-api-access-ph76z\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294420 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ncgg\" (UniqueName: \"kubernetes.io/projected/db1585ab-0970-4bd5-acba-7eda8ed2d40f-kube-api-access-5ncgg\") pod \"cluster-samples-operator-665b6dd947-fs46x\" (UID: \"db1585ab-0970-4bd5-acba-7eda8ed2d40f\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf6tm\" (UniqueName: \"kubernetes.io/projected/adf88882-bc11-4ee1-a2ba-cfd13d62b8dd-kube-api-access-bf6tm\") pod \"migrator-59844c95c7-2f87m\" (UID: \"adf88882-bc11-4ee1-a2ba-cfd13d62b8dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294458 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-etcd-client\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.294473 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6n62\" (UniqueName: \"kubernetes.io/projected/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-kube-api-access-x6n62\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.295345 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2db36f46-e19a-4b7d-a94f-157f65671639-audit-dir\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.296493 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.297285 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-config\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.298058 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-etcd-service-ca\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.298844 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.300379 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-audit-policies\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.301845 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.302536 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5847be39-eb3d-4802-8f96-771f91078979-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.304500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5847be39-eb3d-4802-8f96-771f91078979-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.305154 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-etcd-ca\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.305597 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db1585ab-0970-4bd5-acba-7eda8ed2d40f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fs46x\" (UID: \"db1585ab-0970-4bd5-acba-7eda8ed2d40f\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.305847 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.306411 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-trusted-ca\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.307319 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.307512 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8796bcce-0185-4494-8485-66476bdde45c-metrics-tls\") pod \"dns-operator-744455d44c-fvhqg\" (UID: \"8796bcce-0185-4494-8485-66476bdde45c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.308546 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.309328 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-serving-cert\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.309673 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-metrics-tls\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.316614 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.316789 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-etcd-client\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.316845 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: 
\"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.318154 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.319197 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.323915 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.323944 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.326090 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.346051 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.352102 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a212fe68-9219-479d-bf36-26b08daf31ab-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vdfcr\" (UID: \"a212fe68-9219-479d-bf36-26b08daf31ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.368396 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.378970 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a212fe68-9219-479d-bf36-26b08daf31ab-config\") pod \"kube-apiserver-operator-766d6c64bb-vdfcr\" (UID: \"a212fe68-9219-479d-bf36-26b08daf31ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.389774 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398623 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7fqvf\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 
15:19:43.398669 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-service-ca-bundle\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398713 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31c7ed96-b981-4f85-9f5a-dee62216ecd9-signing-cabundle\") pod \"service-ca-9c57cc56f-vm7fz\" (UID: \"31c7ed96-b981-4f85-9f5a-dee62216ecd9\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398737 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b56fbd59-432d-4672-bc27-0cb80aa81405-images\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398780 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31c7ed96-b981-4f85-9f5a-dee62216ecd9-signing-key\") pod \"service-ca-9c57cc56f-vm7fz\" (UID: \"31c7ed96-b981-4f85-9f5a-dee62216ecd9\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398809 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b56fbd59-432d-4672-bc27-0cb80aa81405-proxy-tls\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" Mar 
19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398833 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b56fbd59-432d-4672-bc27-0cb80aa81405-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398855 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-default-certificate\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph76z\" (UniqueName: \"kubernetes.io/projected/f790bc61-0a88-45de-b672-76b356fb8522-kube-api-access-ph76z\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398923 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf6tm\" (UniqueName: \"kubernetes.io/projected/adf88882-bc11-4ee1-a2ba-cfd13d62b8dd-kube-api-access-bf6tm\") pod \"migrator-59844c95c7-2f87m\" (UID: \"adf88882-bc11-4ee1-a2ba-cfd13d62b8dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398952 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6n62\" (UniqueName: \"kubernetes.io/projected/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-kube-api-access-x6n62\") pod 
\"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.398974 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-stats-auth\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.399092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9srr\" (UniqueName: \"kubernetes.io/projected/b56fbd59-432d-4672-bc27-0cb80aa81405-kube-api-access-l9srr\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.399136 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjfnw\" (UniqueName: \"kubernetes.io/projected/31c7ed96-b981-4f85-9f5a-dee62216ecd9-kube-api-access-fjfnw\") pod \"service-ca-9c57cc56f-vm7fz\" (UID: \"31c7ed96-b981-4f85-9f5a-dee62216ecd9\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.399180 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-metrics-certs\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.399203 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.399231 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gprh\" (UniqueName: \"kubernetes.io/projected/b4eb7061-dde4-44f1-943a-219d2f4f5071-kube-api-access-7gprh\") pod \"marketplace-operator-79b997595-7fqvf\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.399256 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-srv-cert\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.399316 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7fqvf\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.400617 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b56fbd59-432d-4672-bc27-0cb80aa81405-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 
15:19:43.408315 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.425753 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.446432 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.465533 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.486922 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.500284 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.506687 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.526968 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.546326 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.566641 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.606603 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.626566 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.646873 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.667119 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.686926 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.705450 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.728450 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.746109 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.766147 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.786558 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.796389 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-default-certificate\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.807014 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.814826 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-stats-auth\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.826341 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.834726 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-metrics-certs\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.845547 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.851051 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-service-ca-bundle\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.866920 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.885764 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.906236 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.926051 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.946140 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.966188 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 15:19:43 crc kubenswrapper[4771]: I0319 15:19:43.986210 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.006828 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.012855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7fqvf\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.025676 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.054542 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.060848 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7fqvf\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.067502 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.075974 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31c7ed96-b981-4f85-9f5a-dee62216ecd9-signing-key\") pod \"service-ca-9c57cc56f-vm7fz\" (UID: \"31c7ed96-b981-4f85-9f5a-dee62216ecd9\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.086702 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.104041 4771 request.go:700] Waited for 1.002778564s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.105801 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.126185 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.131138 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31c7ed96-b981-4f85-9f5a-dee62216ecd9-signing-cabundle\") pod \"service-ca-9c57cc56f-vm7fz\" (UID: \"31c7ed96-b981-4f85-9f5a-dee62216ecd9\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.145532 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.166004 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.186864 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.206581 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.226740 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.246062 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.266087 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.287151 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.306743 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.326159 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.346313 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.366045 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.386766 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.399918 4771 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.400050 4771 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.400108 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b56fbd59-432d-4672-bc27-0cb80aa81405-proxy-tls podName:b56fbd59-432d-4672-bc27-0cb80aa81405 nodeName:}" failed. No retries permitted until 2026-03-19 15:19:44.90007011 +0000 UTC m=+244.128691352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b56fbd59-432d-4672-bc27-0cb80aa81405-proxy-tls") pod "machine-config-operator-74547568cd-qvlnd" (UID: "b56fbd59-432d-4672-bc27-0cb80aa81405") : failed to sync secret cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.400147 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b56fbd59-432d-4672-bc27-0cb80aa81405-images podName:b56fbd59-432d-4672-bc27-0cb80aa81405 nodeName:}" failed. No retries permitted until 2026-03-19 15:19:44.900130272 +0000 UTC m=+244.128751514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/b56fbd59-432d-4672-bc27-0cb80aa81405-images") pod "machine-config-operator-74547568cd-qvlnd" (UID: "b56fbd59-432d-4672-bc27-0cb80aa81405") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.402337 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.402577 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-profile-collector-cert podName:f790bc61-0a88-45de-b672-76b356fb8522 nodeName:}" failed. No retries permitted until 2026-03-19 15:19:44.902545792 +0000 UTC m=+244.131167034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-profile-collector-cert") pod "olm-operator-6b444d44fb-lxmpv" (UID: "f790bc61-0a88-45de-b672-76b356fb8522") : failed to sync secret cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.402364 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.402737 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-srv-cert podName:f790bc61-0a88-45de-b672-76b356fb8522 nodeName:}" failed. No retries permitted until 2026-03-19 15:19:44.902710657 +0000 UTC m=+244.131331899 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-srv-cert") pod "olm-operator-6b444d44fb-lxmpv" (UID: "f790bc61-0a88-45de-b672-76b356fb8522") : failed to sync secret cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.407075 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.426764 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.446914 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.468725 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.486035 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.501096 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: failed to sync secret cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: E0319 15:19:44.501246 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs podName:7fb3bb21-b72b-45e1-9b87-73f281abba90 nodeName:}" failed. No retries permitted until 2026-03-19 15:21:46.501211494 +0000 UTC m=+365.729832736 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs") pod "network-metrics-daemon-zjhnk" (UID: "7fb3bb21-b72b-45e1-9b87-73f281abba90") : failed to sync secret cache: timed out waiting for the condition
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.506624 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.526187 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.546509 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.565783 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.586213 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.608116 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.626681 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.646115 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.666791 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.686738 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.705824 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.726821 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.746532 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.765952 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.785828 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.831684 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j564s\" (UniqueName: \"kubernetes.io/projected/b11452b4-794b-41a4-a700-0b541916c6ad-kube-api-access-j564s\") pod \"apiserver-7bbb656c7d-9qkhp\" (UID: \"b11452b4-794b-41a4-a700-0b541916c6ad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.851867 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4xcw\" (UniqueName: \"kubernetes.io/projected/6dc754c0-8f17-402b-9bd4-be033eb940ba-kube-api-access-c4xcw\") pod \"console-f9d7485db-h97xq\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.864525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzzgx\" (UniqueName: \"kubernetes.io/projected/b01a56e9-ee30-4945-b582-5ff927104c4c-kube-api-access-lzzgx\") pod \"controller-manager-879f6c89f-jpwl9\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.881395 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.891092 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-454bf\" (UniqueName: \"kubernetes.io/projected/d94aad87-f9aa-4c6b-9846-509b3c6ad6b5-kube-api-access-454bf\") pod \"machine-approver-56656f9798-fjh9n\" (UID: \"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.910934 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vpn5\" (UniqueName: \"kubernetes.io/projected/f5ae0ccc-a50b-46d1-b887-28840703ab87-kube-api-access-7vpn5\") pod \"machine-api-operator-5694c8668f-2mx6f\" (UID: \"f5ae0ccc-a50b-46d1-b887-28840703ab87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.922168 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcgxg\" (UniqueName: \"kubernetes.io/projected/d43df1e2-591a-43e2-a7e2-f48459125711-kube-api-access-xcgxg\") pod \"openshift-apiserver-operator-796bbdcf4f-gvngw\" (UID: \"d43df1e2-591a-43e2-a7e2-f48459125711\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.926439 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.926527 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-srv-cert\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.926670 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b56fbd59-432d-4672-bc27-0cb80aa81405-images\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.926735 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b56fbd59-432d-4672-bc27-0cb80aa81405-proxy-tls\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.926915 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.928513 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b56fbd59-432d-4672-bc27-0cb80aa81405-images\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.936145 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.939942 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.940111 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b56fbd59-432d-4672-bc27-0cb80aa81405-proxy-tls\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.940338 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f790bc61-0a88-45de-b672-76b356fb8522-srv-cert\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.943596 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t76v\" (UniqueName: \"kubernetes.io/projected/77444ace-be14-4606-898d-565c52bec7b0-kube-api-access-8t76v\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwdj8\" (UID: \"77444ace-be14-4606-898d-565c52bec7b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.947297 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9"
Mar 19 15:19:44 crc kubenswrapper[4771]: W0319 15:19:44.952148 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd94aad87_f9aa_4c6b_9846_509b3c6ad6b5.slice/crio-eea6af3dcbcc98b74ef2a5e9ee80d2eefd9660644c04740b65a75e863b72719a WatchSource:0}: Error finding container eea6af3dcbcc98b74ef2a5e9ee80d2eefd9660644c04740b65a75e863b72719a: Status 404 returned error can't find the container with id eea6af3dcbcc98b74ef2a5e9ee80d2eefd9660644c04740b65a75e863b72719a
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.967225 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.977184 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp5v6\" (UniqueName: \"kubernetes.io/projected/91fbe793-06cf-41b4-b24b-a57657dc05f2-kube-api-access-xp5v6\") pod \"authentication-operator-69f744f599-sl682\" (UID: \"91fbe793-06cf-41b4-b24b-a57657dc05f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682"
Mar 19 15:19:44 crc kubenswrapper[4771]: I0319 15:19:44.983214 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpx66\" (UniqueName: \"kubernetes.io/projected/8169e27a-4573-4d03-b3e8-0c072a2efbe7-kube-api-access-dpx66\") pod \"console-operator-58897d9998-wwpj8\" (UID: \"8169e27a-4573-4d03-b3e8-0c072a2efbe7\") " pod="openshift-console-operator/console-operator-58897d9998-wwpj8"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.008954 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfkm6\" (UniqueName: \"kubernetes.io/projected/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-kube-api-access-hfkm6\") pod \"route-controller-manager-6576b87f9c-hmvx4\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.045388 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2426c\" (UniqueName: \"kubernetes.io/projected/1c75a371-7547-47b8-ac59-d296d642cd5c-kube-api-access-2426c\") pod \"apiserver-76f77b778f-rn8sc\" (UID: \"1c75a371-7547-47b8-ac59-d296d642cd5c\") " pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.047196 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.057859 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.066600 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.075548 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.085777 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.104239 4771 request.go:700] Waited for 1.8778317s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.106221 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.108401 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.119246 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.125714 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.147910 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.153899 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n" event={"ID":"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5","Type":"ContainerStarted","Data":"eea6af3dcbcc98b74ef2a5e9ee80d2eefd9660644c04740b65a75e863b72719a"}
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.169306 4771 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.173266 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"]
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.186409 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.192168 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.205739 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.213583 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wwpj8"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.227721 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.246365 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-h97xq"]
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.255402 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8"]
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.265036 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdzmn\" (UniqueName: \"kubernetes.io/projected/6f5dca39-298b-4814-b77c-43dd0cbc4025-kube-api-access-hdzmn\") pod \"downloads-7954f5f757-m25b8\" (UID: \"6f5dca39-298b-4814-b77c-43dd0cbc4025\") " pod="openshift-console/downloads-7954f5f757-m25b8"
Mar 19 15:19:45 crc kubenswrapper[4771]: W0319 15:19:45.274777 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77444ace_be14_4606_898d_565c52bec7b0.slice/crio-5af54543290525a122b93da00cd070fef81d95ae55c0890d0ed18464dda24088 WatchSource:0}: Error finding container 5af54543290525a122b93da00cd070fef81d95ae55c0890d0ed18464dda24088: Status 404 returned error can't find the container with id 5af54543290525a122b93da00cd070fef81d95ae55c0890d0ed18464dda24088
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.287326 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf7k9\" (UniqueName: \"kubernetes.io/projected/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-kube-api-access-qf7k9\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.303496 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vrpz\" (UniqueName: \"kubernetes.io/projected/bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf-kube-api-access-7vrpz\") pod \"etcd-operator-b45778765-8827r\" (UID: \"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8827r"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.322545 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg6bm\" (UniqueName: \"kubernetes.io/projected/5847be39-eb3d-4802-8f96-771f91078979-kube-api-access-vg6bm\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.331360 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-m25b8"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.340023 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ncgg\" (UniqueName: \"kubernetes.io/projected/db1585ab-0970-4bd5-acba-7eda8ed2d40f-kube-api-access-5ncgg\") pod \"cluster-samples-operator-665b6dd947-fs46x\" (UID: \"db1585ab-0970-4bd5-acba-7eda8ed2d40f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.362076 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppbbp\" (UniqueName: \"kubernetes.io/projected/2db36f46-e19a-4b7d-a94f-157f65671639-kube-api-access-ppbbp\") pod \"oauth-openshift-558db77b4-qj9qq\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") " pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.363388 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.381187 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a212fe68-9219-479d-bf36-26b08daf31ab-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vdfcr\" (UID: \"a212fe68-9219-479d-bf36-26b08daf31ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.383247 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.402072 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5847be39-eb3d-4802-8f96-771f91078979-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fk4xm\" (UID: \"5847be39-eb3d-4802-8f96-771f91078979\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.422707 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cnpf\" (UniqueName: \"kubernetes.io/projected/8796bcce-0185-4494-8485-66476bdde45c-kube-api-access-9cnpf\") pod \"dns-operator-744455d44c-fvhqg\" (UID: \"8796bcce-0185-4494-8485-66476bdde45c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg"
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.436482 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpwl9"]
Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.442354 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName:
\"kubernetes.io/projected/c09ae4bc-4073-442f-8cbc-0f42ea00a35d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9sf6z\" (UID: \"c09ae4bc-4073-442f-8cbc-0f42ea00a35d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.472647 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph76z\" (UniqueName: \"kubernetes.io/projected/f790bc61-0a88-45de-b672-76b356fb8522-kube-api-access-ph76z\") pod \"olm-operator-6b444d44fb-lxmpv\" (UID: \"f790bc61-0a88-45de-b672-76b356fb8522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.481521 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf6tm\" (UniqueName: \"kubernetes.io/projected/adf88882-bc11-4ee1-a2ba-cfd13d62b8dd-kube-api-access-bf6tm\") pod \"migrator-59844c95c7-2f87m\" (UID: \"adf88882-bc11-4ee1-a2ba-cfd13d62b8dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.487728 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rn8sc"] Mar 19 15:19:45 crc kubenswrapper[4771]: W0319 15:19:45.493122 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb01a56e9_ee30_4945_b582_5ff927104c4c.slice/crio-8f2f0eaf83d2fe1eb66fb2f852ce2a9273643fe22ab57c87f1e36115962174db WatchSource:0}: Error finding container 8f2f0eaf83d2fe1eb66fb2f852ce2a9273643fe22ab57c87f1e36115962174db: Status 404 returned error can't find the container with id 8f2f0eaf83d2fe1eb66fb2f852ce2a9273643fe22ab57c87f1e36115962174db Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.504000 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6n62\" (UniqueName: 
\"kubernetes.io/projected/0c7b96e7-c74b-4608-a81f-92a7d977c7d9-kube-api-access-x6n62\") pod \"router-default-5444994796-5jvq8\" (UID: \"0c7b96e7-c74b-4608-a81f-92a7d977c7d9\") " pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.528357 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9srr\" (UniqueName: \"kubernetes.io/projected/b56fbd59-432d-4672-bc27-0cb80aa81405-kube-api-access-l9srr\") pod \"machine-config-operator-74547568cd-qvlnd\" (UID: \"b56fbd59-432d-4672-bc27-0cb80aa81405\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.540046 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.555470 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjfnw\" (UniqueName: \"kubernetes.io/projected/31c7ed96-b981-4f85-9f5a-dee62216ecd9-kube-api-access-fjfnw\") pod \"service-ca-9c57cc56f-vm7fz\" (UID: \"31c7ed96-b981-4f85-9f5a-dee62216ecd9\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.560620 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"] Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.561577 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.563772 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gprh\" (UniqueName: \"kubernetes.io/projected/b4eb7061-dde4-44f1-943a-219d2f4f5071-kube-api-access-7gprh\") pod \"marketplace-operator-79b997595-7fqvf\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.571231 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.578332 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.584706 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wwpj8"] Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.590036 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2mx6f"] Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.592795 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw"] Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.614674 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sl682"] Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.614870 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" Mar 19 15:19:45 crc kubenswrapper[4771]: W0319 15:19:45.619465 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5ae0ccc_a50b_46d1_b887_28840703ab87.slice/crio-199fb7d96c8a2f7cb2d56fcaed4bc9fd30ec226821c0733ca4c8a1dfd192ee66 WatchSource:0}: Error finding container 199fb7d96c8a2f7cb2d56fcaed4bc9fd30ec226821c0733ca4c8a1dfd192ee66: Status 404 returned error can't find the container with id 199fb7d96c8a2f7cb2d56fcaed4bc9fd30ec226821c0733ca4c8a1dfd192ee66 Mar 19 15:19:45 crc kubenswrapper[4771]: W0319 15:19:45.624306 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd43df1e2_591a_43e2_a7e2_f48459125711.slice/crio-f5b6053598c28ee322cd2dedc8ab39f0202c014a152cb153db484036e93bc163 WatchSource:0}: Error finding container f5b6053598c28ee322cd2dedc8ab39f0202c014a152cb153db484036e93bc163: Status 404 returned error can't find the container with id f5b6053598c28ee322cd2dedc8ab39f0202c014a152cb153db484036e93bc163 Mar 19 15:19:45 crc kubenswrapper[4771]: W0319 15:19:45.634589 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91fbe793_06cf_41b4_b24b_a57657dc05f2.slice/crio-c2c09af186a62233a1fcb7f0f36f435065ced00a01b4d078189c070e00ba1706 WatchSource:0}: Error finding container c2c09af186a62233a1fcb7f0f36f435065ced00a01b4d078189c070e00ba1706: Status 404 returned error can't find the container with id c2c09af186a62233a1fcb7f0f36f435065ced00a01b4d078189c070e00ba1706 Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.639850 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xlfm\" (UniqueName: \"kubernetes.io/projected/84955373-f1a6-473c-8b85-2f8d4dc29256-kube-api-access-9xlfm\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642709 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6015a9-593b-4a7f-a16b-dc415bba5374-serving-cert\") pod \"service-ca-operator-777779d784-wrsj5\" (UID: \"0f6015a9-593b-4a7f-a16b-dc415bba5374\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/47dd3e68-6697-43ba-b530-61447a1fe1e7-node-bootstrap-token\") pod \"machine-config-server-wbmz4\" (UID: \"47dd3e68-6697-43ba-b530-61447a1fe1e7\") " pod="openshift-machine-config-operator/machine-config-server-wbmz4" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f3dd2372-7717-4dd5-9812-60a628b02dda-profile-collector-cert\") pod \"catalog-operator-68c6474976-54ctz\" (UID: \"f3dd2372-7717-4dd5-9812-60a628b02dda\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642833 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d55d3e9-4387-4456-814b-34317b8768f5-proxy-tls\") pod \"machine-config-controller-84d6567774-2vr9c\" (UID: \"8d55d3e9-4387-4456-814b-34317b8768f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642851 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40ac0b95-eb22-4b20-8870-b440d5fa12d1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lm2zv\" (UID: \"40ac0b95-eb22-4b20-8870-b440d5fa12d1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642866 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct48p\" (UniqueName: \"kubernetes.io/projected/98e7c37e-276a-4fae-aa3a-856f6c33e608-kube-api-access-ct48p\") pod \"multus-admission-controller-857f4d67dd-k96m2\" (UID: \"98e7c37e-276a-4fae-aa3a-856f6c33e608\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642903 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjb4j\" (UniqueName: \"kubernetes.io/projected/af3ce0f9-bc02-4142-8655-9751fe9197db-kube-api-access-vjb4j\") pod \"auto-csr-approver-29565558-wvlb8\" (UID: \"af3ce0f9-bc02-4142-8655-9751fe9197db\") " pod="openshift-infra/auto-csr-approver-29565558-wvlb8" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642931 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crx9f\" (UniqueName: 
\"kubernetes.io/projected/0f6015a9-593b-4a7f-a16b-dc415bba5374-kube-api-access-crx9f\") pod \"service-ca-operator-777779d784-wrsj5\" (UID: \"0f6015a9-593b-4a7f-a16b-dc415bba5374\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.642971 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53b661a1-d7da-45eb-9861-dca1509920ee-serving-cert\") pod \"openshift-config-operator-7777fb866f-wm59s\" (UID: \"53b661a1-d7da-45eb-9861-dca1509920ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643008 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/53b661a1-d7da-45eb-9861-dca1509920ee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wm59s\" (UID: \"53b661a1-d7da-45eb-9861-dca1509920ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643036 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2f99f52-00ff-42f0-a2ee-122235c86b2b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643069 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9687355-25ac-44eb-ab53-25419b1c24b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zpxp6\" (UID: \"d9687355-25ac-44eb-ab53-25419b1c24b6\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643099 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98e7c37e-276a-4fae-aa3a-856f6c33e608-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k96m2\" (UID: \"98e7c37e-276a-4fae-aa3a-856f6c33e608\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643131 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/259b7e23-88c9-452d-a549-a0ccbfacbcb5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-l5rcq\" (UID: \"259b7e23-88c9-452d-a549-a0ccbfacbcb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643151 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9687355-25ac-44eb-ab53-25419b1c24b6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zpxp6\" (UID: \"d9687355-25ac-44eb-ab53-25419b1c24b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643172 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6015a9-593b-4a7f-a16b-dc415bba5374-config\") pod \"service-ca-operator-777779d784-wrsj5\" (UID: \"0f6015a9-593b-4a7f-a16b-dc415bba5374\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643285 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dspn2\" (UniqueName: \"kubernetes.io/projected/6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1-kube-api-access-dspn2\") pod \"package-server-manager-789f6589d5-rdqrx\" (UID: \"6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643322 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84955373-f1a6-473c-8b85-2f8d4dc29256-apiservice-cert\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643808 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/47dd3e68-6697-43ba-b530-61447a1fe1e7-certs\") pod \"machine-config-server-wbmz4\" (UID: \"47dd3e68-6697-43ba-b530-61447a1fe1e7\") " pod="openshift-machine-config-operator/machine-config-server-wbmz4" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.643842 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzgl5\" (UniqueName: \"kubernetes.io/projected/6c6abe54-14f8-429a-9833-9492e274ec41-kube-api-access-gzgl5\") pod \"control-plane-machine-set-operator-78cbb6b69f-tzhdc\" (UID: \"6c6abe54-14f8-429a-9833-9492e274ec41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.644327 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-tls\") 
pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.644566 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-trusted-ca\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.644589 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/84955373-f1a6-473c-8b85-2f8d4dc29256-tmpfs\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.644657 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22kpc\" (UniqueName: \"kubernetes.io/projected/8d55d3e9-4387-4456-814b-34317b8768f5-kube-api-access-22kpc\") pod \"machine-config-controller-84d6567774-2vr9c\" (UID: \"8d55d3e9-4387-4456-814b-34317b8768f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.644679 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-secret-volume\") pod \"collect-profiles-29565555-5tnbp\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.644752 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhh5g\" (UniqueName: \"kubernetes.io/projected/259b7e23-88c9-452d-a549-a0ccbfacbcb5-kube-api-access-dhh5g\") pod \"kube-storage-version-migrator-operator-b67b599dd-l5rcq\" (UID: \"259b7e23-88c9-452d-a549-a0ccbfacbcb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.644788 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dgpr\" (UniqueName: \"kubernetes.io/projected/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-kube-api-access-9dgpr\") pod \"collect-profiles-29565555-5tnbp\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.644906 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-bound-sa-token\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.644979 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9687355-25ac-44eb-ab53-25419b1c24b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zpxp6\" (UID: \"d9687355-25ac-44eb-ab53-25419b1c24b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.645069 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/8d55d3e9-4387-4456-814b-34317b8768f5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2vr9c\" (UID: \"8d55d3e9-4387-4456-814b-34317b8768f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.645092 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-certificates\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.645125 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp4n5\" (UniqueName: \"kubernetes.io/projected/f3dd2372-7717-4dd5-9812-60a628b02dda-kube-api-access-rp4n5\") pod \"catalog-operator-68c6474976-54ctz\" (UID: \"f3dd2372-7717-4dd5-9812-60a628b02dda\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.645413 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s28dr\" (UniqueName: \"kubernetes.io/projected/53b661a1-d7da-45eb-9861-dca1509920ee-kube-api-access-s28dr\") pod \"openshift-config-operator-7777fb866f-wm59s\" (UID: \"53b661a1-d7da-45eb-9861-dca1509920ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.646039 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40ac0b95-eb22-4b20-8870-b440d5fa12d1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lm2zv\" (UID: 
\"40ac0b95-eb22-4b20-8870-b440d5fa12d1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.646073 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84955373-f1a6-473c-8b85-2f8d4dc29256-webhook-cert\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.648774 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650131 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rdqrx\" (UID: \"6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650210 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259b7e23-88c9-452d-a549-a0ccbfacbcb5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-l5rcq\" (UID: \"259b7e23-88c9-452d-a549-a0ccbfacbcb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-config-volume\") pod \"collect-profiles-29565555-5tnbp\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650315 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f3dd2372-7717-4dd5-9812-60a628b02dda-srv-cert\") pod \"catalog-operator-68c6474976-54ctz\" (UID: \"f3dd2372-7717-4dd5-9812-60a628b02dda\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650388 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5f9v\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-kube-api-access-k5f9v\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650458 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2f99f52-00ff-42f0-a2ee-122235c86b2b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650480 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40ac0b95-eb22-4b20-8870-b440d5fa12d1-config\") pod \"kube-controller-manager-operator-78b949d7b-lm2zv\" (UID: \"40ac0b95-eb22-4b20-8870-b440d5fa12d1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650566 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650644 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c6abe54-14f8-429a-9833-9492e274ec41-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tzhdc\" (UID: \"6c6abe54-14f8-429a-9833-9492e274ec41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.650669 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnc8s\" (UniqueName: \"kubernetes.io/projected/47dd3e68-6697-43ba-b530-61447a1fe1e7-kube-api-access-lnc8s\") pod \"machine-config-server-wbmz4\" (UID: \"47dd3e68-6697-43ba-b530-61447a1fe1e7\") " pod="openshift-machine-config-operator/machine-config-server-wbmz4" Mar 19 15:19:45 crc kubenswrapper[4771]: E0319 15:19:45.651103 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:46.151090168 +0000 UTC m=+245.379711370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.659771 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.670772 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.712452 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x"] Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.737201 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.751729 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.751998 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d16bbb1e-ba55-4380-a503-e67f57dae69a-cert\") pod \"ingress-canary-ngxwv\" (UID: \"d16bbb1e-ba55-4380-a503-e67f57dae69a\") " pod="openshift-ingress-canary/ingress-canary-ngxwv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752038 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-bound-sa-token\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752082 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9687355-25ac-44eb-ab53-25419b1c24b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zpxp6\" (UID: \"d9687355-25ac-44eb-ab53-25419b1c24b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752104 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-registration-dir\") pod 
\"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752125 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8d55d3e9-4387-4456-814b-34317b8768f5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2vr9c\" (UID: \"8d55d3e9-4387-4456-814b-34317b8768f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752145 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-certificates\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752162 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp4n5\" (UniqueName: \"kubernetes.io/projected/f3dd2372-7717-4dd5-9812-60a628b02dda-kube-api-access-rp4n5\") pod \"catalog-operator-68c6474976-54ctz\" (UID: \"f3dd2372-7717-4dd5-9812-60a628b02dda\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752206 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s28dr\" (UniqueName: \"kubernetes.io/projected/53b661a1-d7da-45eb-9861-dca1509920ee-kube-api-access-s28dr\") pod \"openshift-config-operator-7777fb866f-wm59s\" (UID: \"53b661a1-d7da-45eb-9861-dca1509920ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752249 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40ac0b95-eb22-4b20-8870-b440d5fa12d1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lm2zv\" (UID: \"40ac0b95-eb22-4b20-8870-b440d5fa12d1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752265 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84955373-f1a6-473c-8b85-2f8d4dc29256-webhook-cert\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752283 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rdqrx\" (UID: \"6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752302 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259b7e23-88c9-452d-a549-a0ccbfacbcb5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-l5rcq\" (UID: \"259b7e23-88c9-452d-a549-a0ccbfacbcb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-config-volume\") pod 
\"collect-profiles-29565555-5tnbp\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752337 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-socket-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752364 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f3dd2372-7717-4dd5-9812-60a628b02dda-srv-cert\") pod \"catalog-operator-68c6474976-54ctz\" (UID: \"f3dd2372-7717-4dd5-9812-60a628b02dda\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752380 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5f9v\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-kube-api-access-k5f9v\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752400 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2f99f52-00ff-42f0-a2ee-122235c86b2b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752416 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/40ac0b95-eb22-4b20-8870-b440d5fa12d1-config\") pod \"kube-controller-manager-operator-78b949d7b-lm2zv\" (UID: \"40ac0b95-eb22-4b20-8870-b440d5fa12d1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752472 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c6abe54-14f8-429a-9833-9492e274ec41-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tzhdc\" (UID: \"6c6abe54-14f8-429a-9833-9492e274ec41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752491 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnc8s\" (UniqueName: \"kubernetes.io/projected/47dd3e68-6697-43ba-b530-61447a1fe1e7-kube-api-access-lnc8s\") pod \"machine-config-server-wbmz4\" (UID: \"47dd3e68-6697-43ba-b530-61447a1fe1e7\") " pod="openshift-machine-config-operator/machine-config-server-wbmz4" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752508 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwbh7\" (UniqueName: \"kubernetes.io/projected/398ecde0-3269-4a96-a831-9762c0fd76b0-kube-api-access-bwbh7\") pod \"dns-default-hwlpq\" (UID: \"398ecde0-3269-4a96-a831-9762c0fd76b0\") " pod="openshift-dns/dns-default-hwlpq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752536 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xlfm\" (UniqueName: \"kubernetes.io/projected/84955373-f1a6-473c-8b85-2f8d4dc29256-kube-api-access-9xlfm\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752551 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-mountpoint-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752571 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6015a9-593b-4a7f-a16b-dc415bba5374-serving-cert\") pod \"service-ca-operator-777779d784-wrsj5\" (UID: \"0f6015a9-593b-4a7f-a16b-dc415bba5374\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752601 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/47dd3e68-6697-43ba-b530-61447a1fe1e7-node-bootstrap-token\") pod \"machine-config-server-wbmz4\" (UID: \"47dd3e68-6697-43ba-b530-61447a1fe1e7\") " pod="openshift-machine-config-operator/machine-config-server-wbmz4" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752617 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-csi-data-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752634 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f3dd2372-7717-4dd5-9812-60a628b02dda-profile-collector-cert\") 
pod \"catalog-operator-68c6474976-54ctz\" (UID: \"f3dd2372-7717-4dd5-9812-60a628b02dda\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-plugins-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752668 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d55d3e9-4387-4456-814b-34317b8768f5-proxy-tls\") pod \"machine-config-controller-84d6567774-2vr9c\" (UID: \"8d55d3e9-4387-4456-814b-34317b8768f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752683 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40ac0b95-eb22-4b20-8870-b440d5fa12d1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lm2zv\" (UID: \"40ac0b95-eb22-4b20-8870-b440d5fa12d1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752701 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct48p\" (UniqueName: \"kubernetes.io/projected/98e7c37e-276a-4fae-aa3a-856f6c33e608-kube-api-access-ct48p\") pod \"multus-admission-controller-857f4d67dd-k96m2\" (UID: \"98e7c37e-276a-4fae-aa3a-856f6c33e608\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752727 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vjb4j\" (UniqueName: \"kubernetes.io/projected/af3ce0f9-bc02-4142-8655-9751fe9197db-kube-api-access-vjb4j\") pod \"auto-csr-approver-29565558-wvlb8\" (UID: \"af3ce0f9-bc02-4142-8655-9751fe9197db\") " pod="openshift-infra/auto-csr-approver-29565558-wvlb8" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752743 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crx9f\" (UniqueName: \"kubernetes.io/projected/0f6015a9-593b-4a7f-a16b-dc415bba5374-kube-api-access-crx9f\") pod \"service-ca-operator-777779d784-wrsj5\" (UID: \"0f6015a9-593b-4a7f-a16b-dc415bba5374\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752776 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53b661a1-d7da-45eb-9861-dca1509920ee-serving-cert\") pod \"openshift-config-operator-7777fb866f-wm59s\" (UID: \"53b661a1-d7da-45eb-9861-dca1509920ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752791 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/53b661a1-d7da-45eb-9861-dca1509920ee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wm59s\" (UID: \"53b661a1-d7da-45eb-9861-dca1509920ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2f99f52-00ff-42f0-a2ee-122235c86b2b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752850 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9687355-25ac-44eb-ab53-25419b1c24b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zpxp6\" (UID: \"d9687355-25ac-44eb-ab53-25419b1c24b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752875 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98e7c37e-276a-4fae-aa3a-856f6c33e608-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k96m2\" (UID: \"98e7c37e-276a-4fae-aa3a-856f6c33e608\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752892 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/259b7e23-88c9-452d-a549-a0ccbfacbcb5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-l5rcq\" (UID: \"259b7e23-88c9-452d-a549-a0ccbfacbcb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752907 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/398ecde0-3269-4a96-a831-9762c0fd76b0-config-volume\") pod \"dns-default-hwlpq\" (UID: \"398ecde0-3269-4a96-a831-9762c0fd76b0\") " pod="openshift-dns/dns-default-hwlpq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752923 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d9687355-25ac-44eb-ab53-25419b1c24b6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zpxp6\" (UID: \"d9687355-25ac-44eb-ab53-25419b1c24b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752940 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6015a9-593b-4a7f-a16b-dc415bba5374-config\") pod \"service-ca-operator-777779d784-wrsj5\" (UID: \"0f6015a9-593b-4a7f-a16b-dc415bba5374\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.752975 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt8wg\" (UniqueName: \"kubernetes.io/projected/d16bbb1e-ba55-4380-a503-e67f57dae69a-kube-api-access-zt8wg\") pod \"ingress-canary-ngxwv\" (UID: \"d16bbb1e-ba55-4380-a503-e67f57dae69a\") " pod="openshift-ingress-canary/ingress-canary-ngxwv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753007 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dspn2\" (UniqueName: \"kubernetes.io/projected/6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1-kube-api-access-dspn2\") pod \"package-server-manager-789f6589d5-rdqrx\" (UID: \"6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753024 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84955373-f1a6-473c-8b85-2f8d4dc29256-apiservice-cert\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 
crc kubenswrapper[4771]: I0319 15:19:45.753042 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/398ecde0-3269-4a96-a831-9762c0fd76b0-metrics-tls\") pod \"dns-default-hwlpq\" (UID: \"398ecde0-3269-4a96-a831-9762c0fd76b0\") " pod="openshift-dns/dns-default-hwlpq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753066 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/47dd3e68-6697-43ba-b530-61447a1fe1e7-certs\") pod \"machine-config-server-wbmz4\" (UID: \"47dd3e68-6697-43ba-b530-61447a1fe1e7\") " pod="openshift-machine-config-operator/machine-config-server-wbmz4" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753082 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzgl5\" (UniqueName: \"kubernetes.io/projected/6c6abe54-14f8-429a-9833-9492e274ec41-kube-api-access-gzgl5\") pod \"control-plane-machine-set-operator-78cbb6b69f-tzhdc\" (UID: \"6c6abe54-14f8-429a-9833-9492e274ec41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs69d\" (UniqueName: \"kubernetes.io/projected/a0231568-cb75-4c4a-be45-e07f0a03c320-kube-api-access-zs69d\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753121 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-tls\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753136 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-trusted-ca\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753151 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/84955373-f1a6-473c-8b85-2f8d4dc29256-tmpfs\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753177 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22kpc\" (UniqueName: \"kubernetes.io/projected/8d55d3e9-4387-4456-814b-34317b8768f5-kube-api-access-22kpc\") pod \"machine-config-controller-84d6567774-2vr9c\" (UID: \"8d55d3e9-4387-4456-814b-34317b8768f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753193 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-secret-volume\") pod \"collect-profiles-29565555-5tnbp\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753210 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhh5g\" (UniqueName: 
\"kubernetes.io/projected/259b7e23-88c9-452d-a549-a0ccbfacbcb5-kube-api-access-dhh5g\") pod \"kube-storage-version-migrator-operator-b67b599dd-l5rcq\" (UID: \"259b7e23-88c9-452d-a549-a0ccbfacbcb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.753238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dgpr\" (UniqueName: \"kubernetes.io/projected/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-kube-api-access-9dgpr\") pod \"collect-profiles-29565555-5tnbp\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.754751 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/53b661a1-d7da-45eb-9861-dca1509920ee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wm59s\" (UID: \"53b661a1-d7da-45eb-9861-dca1509920ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.756057 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/84955373-f1a6-473c-8b85-2f8d4dc29256-tmpfs\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.756649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40ac0b95-eb22-4b20-8870-b440d5fa12d1-config\") pod \"kube-controller-manager-operator-78b949d7b-lm2zv\" (UID: \"40ac0b95-eb22-4b20-8870-b440d5fa12d1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.759482 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53b661a1-d7da-45eb-9861-dca1509920ee-serving-cert\") pod \"openshift-config-operator-7777fb866f-wm59s\" (UID: \"53b661a1-d7da-45eb-9861-dca1509920ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.759569 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c6abe54-14f8-429a-9833-9492e274ec41-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tzhdc\" (UID: \"6c6abe54-14f8-429a-9833-9492e274ec41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.760188 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84955373-f1a6-473c-8b85-2f8d4dc29256-webhook-cert\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.761231 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6015a9-593b-4a7f-a16b-dc415bba5374-config\") pod \"service-ca-operator-777779d784-wrsj5\" (UID: \"0f6015a9-593b-4a7f-a16b-dc415bba5374\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.766415 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.766458 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f3dd2372-7717-4dd5-9812-60a628b02dda-profile-collector-cert\") pod \"catalog-operator-68c6474976-54ctz\" (UID: \"f3dd2372-7717-4dd5-9812-60a628b02dda\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.769276 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259b7e23-88c9-452d-a549-a0ccbfacbcb5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-l5rcq\" (UID: \"259b7e23-88c9-452d-a549-a0ccbfacbcb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.769736 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-certificates\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.770452 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-config-volume\") pod \"collect-profiles-29565555-5tnbp\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.770816 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-trusted-ca\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: E0319 15:19:45.771568 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:46.271548682 +0000 UTC m=+245.500169884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.772708 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2f99f52-00ff-42f0-a2ee-122235c86b2b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.773366 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2f99f52-00ff-42f0-a2ee-122235c86b2b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.774336 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8d55d3e9-4387-4456-814b-34317b8768f5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2vr9c\" (UID: \"8d55d3e9-4387-4456-814b-34317b8768f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.774614 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.775093 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9687355-25ac-44eb-ab53-25419b1c24b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zpxp6\" (UID: \"d9687355-25ac-44eb-ab53-25419b1c24b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.775519 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/259b7e23-88c9-452d-a549-a0ccbfacbcb5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-l5rcq\" (UID: \"259b7e23-88c9-452d-a549-a0ccbfacbcb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.780365 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-secret-volume\") pod \"collect-profiles-29565555-5tnbp\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.781176 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9687355-25ac-44eb-ab53-25419b1c24b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zpxp6\" (UID: \"d9687355-25ac-44eb-ab53-25419b1c24b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.781685 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/47dd3e68-6697-43ba-b530-61447a1fe1e7-certs\") pod \"machine-config-server-wbmz4\" (UID: \"47dd3e68-6697-43ba-b530-61447a1fe1e7\") " pod="openshift-machine-config-operator/machine-config-server-wbmz4" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.782232 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/47dd3e68-6697-43ba-b530-61447a1fe1e7-node-bootstrap-token\") pod \"machine-config-server-wbmz4\" (UID: \"47dd3e68-6697-43ba-b530-61447a1fe1e7\") " pod="openshift-machine-config-operator/machine-config-server-wbmz4" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.782233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d55d3e9-4387-4456-814b-34317b8768f5-proxy-tls\") pod \"machine-config-controller-84d6567774-2vr9c\" (UID: \"8d55d3e9-4387-4456-814b-34317b8768f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.782579 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40ac0b95-eb22-4b20-8870-b440d5fa12d1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lm2zv\" (UID: \"40ac0b95-eb22-4b20-8870-b440d5fa12d1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 
15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.782583 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m"] Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.782647 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98e7c37e-276a-4fae-aa3a-856f6c33e608-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k96m2\" (UID: \"98e7c37e-276a-4fae-aa3a-856f6c33e608\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.782768 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84955373-f1a6-473c-8b85-2f8d4dc29256-apiservice-cert\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.783241 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rdqrx\" (UID: \"6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.786264 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-tls\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.789999 4771 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6015a9-593b-4a7f-a16b-dc415bba5374-serving-cert\") pod \"service-ca-operator-777779d784-wrsj5\" (UID: \"0f6015a9-593b-4a7f-a16b-dc415bba5374\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.791019 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f3dd2372-7717-4dd5-9812-60a628b02dda-srv-cert\") pod \"catalog-operator-68c6474976-54ctz\" (UID: \"f3dd2372-7717-4dd5-9812-60a628b02dda\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.799173 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9687355-25ac-44eb-ab53-25419b1c24b6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zpxp6\" (UID: \"d9687355-25ac-44eb-ab53-25419b1c24b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.819651 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.824404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct48p\" (UniqueName: \"kubernetes.io/projected/98e7c37e-276a-4fae-aa3a-856f6c33e608-kube-api-access-ct48p\") pod \"multus-admission-controller-857f4d67dd-k96m2\" (UID: \"98e7c37e-276a-4fae-aa3a-856f6c33e608\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.838742 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjb4j\" (UniqueName: \"kubernetes.io/projected/af3ce0f9-bc02-4142-8655-9751fe9197db-kube-api-access-vjb4j\") pod \"auto-csr-approver-29565558-wvlb8\" (UID: \"af3ce0f9-bc02-4142-8655-9751fe9197db\") " pod="openshift-infra/auto-csr-approver-29565558-wvlb8" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.856462 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-registration-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.856887 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-registration-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.856961 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-socket-dir\") pod 
\"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857084 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-socket-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: E0319 15:19:45.857499 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:46.357473381 +0000 UTC m=+245.586094583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857077 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwbh7\" (UniqueName: \"kubernetes.io/projected/398ecde0-3269-4a96-a831-9762c0fd76b0-kube-api-access-bwbh7\") pod \"dns-default-hwlpq\" (UID: \"398ecde0-3269-4a96-a831-9762c0fd76b0\") " pod="openshift-dns/dns-default-hwlpq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857592 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-mountpoint-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857629 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-csi-data-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857661 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-plugins-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857756 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/398ecde0-3269-4a96-a831-9762c0fd76b0-config-volume\") pod \"dns-default-hwlpq\" (UID: \"398ecde0-3269-4a96-a831-9762c0fd76b0\") " pod="openshift-dns/dns-default-hwlpq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt8wg\" (UniqueName: \"kubernetes.io/projected/d16bbb1e-ba55-4380-a503-e67f57dae69a-kube-api-access-zt8wg\") pod \"ingress-canary-ngxwv\" (UID: \"d16bbb1e-ba55-4380-a503-e67f57dae69a\") " pod="openshift-ingress-canary/ingress-canary-ngxwv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857835 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/398ecde0-3269-4a96-a831-9762c0fd76b0-metrics-tls\") pod \"dns-default-hwlpq\" (UID: \"398ecde0-3269-4a96-a831-9762c0fd76b0\") " pod="openshift-dns/dns-default-hwlpq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857875 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs69d\" (UniqueName: \"kubernetes.io/projected/a0231568-cb75-4c4a-be45-e07f0a03c320-kube-api-access-zs69d\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.857938 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-plugins-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc 
kubenswrapper[4771]: I0319 15:19:45.857951 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d16bbb1e-ba55-4380-a503-e67f57dae69a-cert\") pod \"ingress-canary-ngxwv\" (UID: \"d16bbb1e-ba55-4380-a503-e67f57dae69a\") " pod="openshift-ingress-canary/ingress-canary-ngxwv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.858204 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-mountpoint-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.858297 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a0231568-cb75-4c4a-be45-e07f0a03c320-csi-data-dir\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.859190 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/398ecde0-3269-4a96-a831-9762c0fd76b0-config-volume\") pod \"dns-default-hwlpq\" (UID: \"398ecde0-3269-4a96-a831-9762c0fd76b0\") " pod="openshift-dns/dns-default-hwlpq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.864592 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d16bbb1e-ba55-4380-a503-e67f57dae69a-cert\") pod \"ingress-canary-ngxwv\" (UID: \"d16bbb1e-ba55-4380-a503-e67f57dae69a\") " pod="openshift-ingress-canary/ingress-canary-ngxwv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.864770 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/398ecde0-3269-4a96-a831-9762c0fd76b0-metrics-tls\") pod \"dns-default-hwlpq\" (UID: \"398ecde0-3269-4a96-a831-9762c0fd76b0\") " pod="openshift-dns/dns-default-hwlpq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.864835 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzgl5\" (UniqueName: \"kubernetes.io/projected/6c6abe54-14f8-429a-9833-9492e274ec41-kube-api-access-gzgl5\") pod \"control-plane-machine-set-operator-78cbb6b69f-tzhdc\" (UID: \"6c6abe54-14f8-429a-9833-9492e274ec41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.873502 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qj9qq"] Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.877671 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-m25b8"] Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.883940 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crx9f\" (UniqueName: \"kubernetes.io/projected/0f6015a9-593b-4a7f-a16b-dc415bba5374-kube-api-access-crx9f\") pod \"service-ca-operator-777779d784-wrsj5\" (UID: \"0f6015a9-593b-4a7f-a16b-dc415bba5374\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.899068 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565558-wvlb8" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.939406 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dspn2\" (UniqueName: \"kubernetes.io/projected/6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1-kube-api-access-dspn2\") pod \"package-server-manager-789f6589d5-rdqrx\" (UID: \"6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.952797 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnc8s\" (UniqueName: \"kubernetes.io/projected/47dd3e68-6697-43ba-b530-61447a1fe1e7-kube-api-access-lnc8s\") pod \"machine-config-server-wbmz4\" (UID: \"47dd3e68-6697-43ba-b530-61447a1fe1e7\") " pod="openshift-machine-config-operator/machine-config-server-wbmz4" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.954096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhh5g\" (UniqueName: \"kubernetes.io/projected/259b7e23-88c9-452d-a549-a0ccbfacbcb5-kube-api-access-dhh5g\") pod \"kube-storage-version-migrator-operator-b67b599dd-l5rcq\" (UID: \"259b7e23-88c9-452d-a549-a0ccbfacbcb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.962487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:45 crc kubenswrapper[4771]: E0319 15:19:45.965107 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:46.465073998 +0000 UTC m=+245.693695200 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.973064 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40ac0b95-eb22-4b20-8870-b440d5fa12d1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lm2zv\" (UID: \"40ac0b95-eb22-4b20-8870-b440d5fa12d1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.985856 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dgpr\" (UniqueName: \"kubernetes.io/projected/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-kube-api-access-9dgpr\") pod \"collect-profiles-29565555-5tnbp\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:45 crc kubenswrapper[4771]: I0319 15:19:45.994297 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.013076 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xlfm\" (UniqueName: \"kubernetes.io/projected/84955373-f1a6-473c-8b85-2f8d4dc29256-kube-api-access-9xlfm\") pod \"packageserver-d55dfcdfc-q7b76\" (UID: \"84955373-f1a6-473c-8b85-2f8d4dc29256\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.016603 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.031372 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s28dr\" (UniqueName: \"kubernetes.io/projected/53b661a1-d7da-45eb-9861-dca1509920ee-kube-api-access-s28dr\") pod \"openshift-config-operator-7777fb866f-wm59s\" (UID: \"53b661a1-d7da-45eb-9861-dca1509920ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.042663 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.054400 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp4n5\" (UniqueName: \"kubernetes.io/projected/f3dd2372-7717-4dd5-9812-60a628b02dda-kube-api-access-rp4n5\") pod \"catalog-operator-68c6474976-54ctz\" (UID: \"f3dd2372-7717-4dd5-9812-60a628b02dda\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.064653 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.064831 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5f9v\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-kube-api-access-k5f9v\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.065172 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:46.565155616 +0000 UTC m=+245.793776818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.079559 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.087267 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-bound-sa-token\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.103063 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.103459 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.111119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22kpc\" (UniqueName: \"kubernetes.io/projected/8d55d3e9-4387-4456-814b-34317b8768f5-kube-api-access-22kpc\") pod \"machine-config-controller-84d6567774-2vr9c\" (UID: \"8d55d3e9-4387-4456-814b-34317b8768f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.121570 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm"] Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.126389 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.146021 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.153573 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.167852 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.168281 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:46.668259341 +0000 UTC m=+245.896880543 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.171682 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs69d\" (UniqueName: \"kubernetes.io/projected/a0231568-cb75-4c4a-be45-e07f0a03c320-kube-api-access-zs69d\") pod \"csi-hostpathplugin-bwgwn\" (UID: \"a0231568-cb75-4c4a-be45-e07f0a03c320\") " pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.178398 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt8wg\" (UniqueName: 
\"kubernetes.io/projected/d16bbb1e-ba55-4380-a503-e67f57dae69a-kube-api-access-zt8wg\") pod \"ingress-canary-ngxwv\" (UID: \"d16bbb1e-ba55-4380-a503-e67f57dae69a\") " pod="openshift-ingress-canary/ingress-canary-ngxwv"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.191153 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" event={"ID":"91fbe793-06cf-41b4-b24b-a57657dc05f2","Type":"ContainerStarted","Data":"c499e07a767a4fa1d9a7530d4828cdf1ab09daedd212ed653f2ed4ec392464fd"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.191199 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" event={"ID":"91fbe793-06cf-41b4-b24b-a57657dc05f2","Type":"ContainerStarted","Data":"c2c09af186a62233a1fcb7f0f36f435065ced00a01b4d078189c070e00ba1706"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.196338 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwbh7\" (UniqueName: \"kubernetes.io/projected/398ecde0-3269-4a96-a831-9762c0fd76b0-kube-api-access-bwbh7\") pod \"dns-default-hwlpq\" (UID: \"398ecde0-3269-4a96-a831-9762c0fd76b0\") " pod="openshift-dns/dns-default-hwlpq"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.201386 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8827r"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.211838 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.212123 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h97xq" event={"ID":"6dc754c0-8f17-402b-9bd4-be033eb940ba","Type":"ContainerStarted","Data":"5ce486487bcb8cdb2522de50d8410760e3f65be7ffe691bb5c05dfbba5142e72"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.212160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h97xq" event={"ID":"6dc754c0-8f17-402b-9bd4-be033eb940ba","Type":"ContainerStarted","Data":"897383fea6d9e7358093ca896f88dc9dcba5008acfd0c691f40579af3bb59057"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.214847 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wbmz4"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.216883 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5jvq8" event={"ID":"0c7b96e7-c74b-4608-a81f-92a7d977c7d9","Type":"ContainerStarted","Data":"3fdd3eaba3441f40d034f252e8832070fa5aeb905c3fce7d9d4fd9d142bc26fc"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.223254 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m" event={"ID":"adf88882-bc11-4ee1-a2ba-cfd13d62b8dd","Type":"ContainerStarted","Data":"316ec7c6a8e07fa1a0505315e57863d608c584751a8db8244da25d99e31e4e1b"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.223940 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ngxwv"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.224612 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.231475 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-m25b8" event={"ID":"6f5dca39-298b-4814-b77c-43dd0cbc4025","Type":"ContainerStarted","Data":"a309876861145a52e779414e27d23e5f660bb8afeb09910d0cd021b8c11e16f7"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.234305 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bwgwn"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.234938 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" event={"ID":"2db36f46-e19a-4b7d-a94f-157f65671639","Type":"ContainerStarted","Data":"bd4d929b59f803fe666e4d85d49f72109ba2f8a6c8da095f0783f57978d76d10"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.239053 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" event={"ID":"f5ae0ccc-a50b-46d1-b887-28840703ab87","Type":"ContainerStarted","Data":"199fb7d96c8a2f7cb2d56fcaed4bc9fd30ec226821c0733ca4c8a1dfd192ee66"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.251101 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wwpj8" event={"ID":"8169e27a-4573-4d03-b3e8-0c072a2efbe7","Type":"ContainerStarted","Data":"e359579184c343707f070d7246520011c85c56b44f3c8a89fc3686c99350f44f"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.251164 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wwpj8" event={"ID":"8169e27a-4573-4d03-b3e8-0c072a2efbe7","Type":"ContainerStarted","Data":"f12791db17e8a0006356e20f4c0bdcf559e03e36345f8c879fa996833b8c6662"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.252898 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wwpj8"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.254854 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwlpq"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.273159 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.280978 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:46.780949547 +0000 UTC m=+246.009570869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.292620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x" event={"ID":"db1585ab-0970-4bd5-acba-7eda8ed2d40f","Type":"ContainerStarted","Data":"80171c42f6c2ee66cf586c7e7140bd5236541dccf183382f86a9d3ebfbb3c285"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.307674 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.309112 4771 patch_prober.go:28] interesting pod/console-operator-58897d9998-wwpj8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.309208 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wwpj8" podUID="8169e27a-4573-4d03-b3e8-0c072a2efbe7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.313216 4771 generic.go:334] "Generic (PLEG): container finished" podID="b11452b4-794b-41a4-a700-0b541916c6ad" containerID="6e7ac7d0550610cc9d50632470b95b4225bfdf2ec4d817e2a8df98479a127ed7" exitCode=0
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.313285 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" event={"ID":"b11452b4-794b-41a4-a700-0b541916c6ad","Type":"ContainerDied","Data":"6e7ac7d0550610cc9d50632470b95b4225bfdf2ec4d817e2a8df98479a127ed7"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.313317 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" event={"ID":"b11452b4-794b-41a4-a700-0b541916c6ad","Type":"ContainerStarted","Data":"db6f63f74460425aa922f0bde437527b52bfebb679fba64f1954bc3750b14bc2"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.364290 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" event={"ID":"b01a56e9-ee30-4945-b582-5ff927104c4c","Type":"ContainerStarted","Data":"663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.364341 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" event={"ID":"b01a56e9-ee30-4945-b582-5ff927104c4c","Type":"ContainerStarted","Data":"8f2f0eaf83d2fe1eb66fb2f852ce2a9273643fe22ab57c87f1e36115962174db"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.365055 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.372277 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" event={"ID":"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef","Type":"ContainerStarted","Data":"c8d8850098f2ad1d5d38dff8043e6437ffd39fb019ea824056a52baba62ecd4f"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.373031 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.376471 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.378157 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:46.878138901 +0000 UTC m=+246.106760103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.390165 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.403587 4771 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hmvx4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.403716 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" podUID="93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.403935 4771 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jpwl9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.403957 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" podUID="b01a56e9-ee30-4945-b582-5ff927104c4c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.407523 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.411480 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n" event={"ID":"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5","Type":"ContainerStarted","Data":"1e8d77403588cf6a65877027acda1052baad8cad528f0f51a8af7ea87e305299"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.411525 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n" event={"ID":"d94aad87-f9aa-4c6b-9846-509b3c6ad6b5","Type":"ContainerStarted","Data":"f02736447583caa43433ee37f3fbd85d6cc9f26ac16d296e3a5065fc7914aef1"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.430843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8" event={"ID":"77444ace-be14-4606-898d-565c52bec7b0","Type":"ContainerStarted","Data":"319b632884e8b6e45e72bfbf059927d3a42e818ffbdb2c03a210671646525faa"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.430894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8" event={"ID":"77444ace-be14-4606-898d-565c52bec7b0","Type":"ContainerStarted","Data":"5af54543290525a122b93da00cd070fef81d95ae55c0890d0ed18464dda24088"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.432523 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" event={"ID":"d43df1e2-591a-43e2-a7e2-f48459125711","Type":"ContainerStarted","Data":"f5b6053598c28ee322cd2dedc8ab39f0202c014a152cb153db484036e93bc163"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.437518 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" event={"ID":"1c75a371-7547-47b8-ac59-d296d642cd5c","Type":"ContainerStarted","Data":"a5d15c0479c56f497dd7318ed842f2897c05f3658a422ed465ace0d279fcd0d2"}
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.481031 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.483292 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:46.983277638 +0000 UTC m=+246.211898840 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.585096 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.586207 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.086185176 +0000 UTC m=+246.314806378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.676931 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fvhqg"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.690460 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.690774 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.190761798 +0000 UTC m=+246.419383000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.730395 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46604: no serving certificate available for the kubelet"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.739121 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5jvq8"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.756935 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vm7fz"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.766647 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565558-wvlb8"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.773131 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7fqvf"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.802866 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.803181 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.303132677 +0000 UTC m=+246.531753879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.803672 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.804094 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.304086681 +0000 UTC m=+246.532707883 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.845489 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46610: no serving certificate available for the kubelet"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.850130 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.858009 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr"]
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.866601 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" podStartSLOduration=186.866576655 podStartE2EDuration="3m6.866576655s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:46.863297312 +0000 UTC m=+246.091918514" watchObservedRunningTime="2026-03-19 15:19:46.866576655 +0000 UTC m=+246.095197857"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.906535 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:46 crc kubenswrapper[4771]: E0319 15:19:46.906924 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.406905498 +0000 UTC m=+246.635526700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.929517 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46624: no serving certificate available for the kubelet"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.988104 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-h97xq" podStartSLOduration=186.988086536 podStartE2EDuration="3m6.988086536s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:46.985763147 +0000 UTC m=+246.214384349" watchObservedRunningTime="2026-03-19 15:19:46.988086536 +0000 UTC m=+246.216707738"
Mar 19 15:19:46 crc kubenswrapper[4771]: I0319 15:19:46.988491 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" podStartSLOduration=187.988486286 podStartE2EDuration="3m7.988486286s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:46.953664784 +0000 UTC m=+246.182285986" watchObservedRunningTime="2026-03-19 15:19:46.988486286 +0000 UTC m=+246.217107488"
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.008486 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.008951 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.508930905 +0000 UTC m=+246.737552107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.029913 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46636: no serving certificate available for the kubelet"
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.083482 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.137764 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 15:19:47 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld
Mar 19 15:19:47 crc kubenswrapper[4771]: [+]process-running ok
Mar 19 15:19:47 crc kubenswrapper[4771]: healthz check failed
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.137829 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.141357 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46652: no serving certificate available for the kubelet"
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.158847 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.162438 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.662404196 +0000 UTC m=+246.891025388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.178952 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.179818 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.679800057 +0000 UTC m=+246.908421259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.279434 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46658: no serving certificate available for the kubelet"
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.282054 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.282555 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.782535671 +0000 UTC m=+247.011156873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.383418 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.383822 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.883810239 +0000 UTC m=+247.112431441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.455404 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5"]
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.485752 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.486282 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:47.986266537 +0000 UTC m=+247.214887729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.495357 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46666: no serving certificate available for the kubelet"
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.566558 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" podStartSLOduration=187.566541382 podStartE2EDuration="3m7.566541382s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:47.56446791 +0000 UTC m=+246.793089102" watchObservedRunningTime="2026-03-19 15:19:47.566541382 +0000 UTC m=+246.795162584"
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.580513 4771 generic.go:334] "Generic (PLEG): container finished" podID="1c75a371-7547-47b8-ac59-d296d642cd5c" containerID="f87c9010b5400087feb1291d3cbb97c6f8d1bbd0211c486ae28bfa3ea0d76f90" exitCode=0
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.590663 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.594269 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:48.094252665 +0000 UTC m=+247.322873867 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.606059 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" event={"ID":"c09ae4bc-4073-442f-8cbc-0f42ea00a35d","Type":"ContainerStarted","Data":"0d18037413b5f5de693544a69e2fb51939053bc9826c1cfc8741003fc4b4e03b"}
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.606100 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" event={"ID":"f790bc61-0a88-45de-b672-76b356fb8522","Type":"ContainerStarted","Data":"f8f433bfc798469764d13f01243b1dd11f84eeef53761c8b4a6c678321d3c821"}
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.606110 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565558-wvlb8" event={"ID":"af3ce0f9-bc02-4142-8655-9751fe9197db","Type":"ContainerStarted","Data":"a18a109f93aac2a6f097c6a886b1af52bad7df99857ce81938a3b494936e06f9"}
Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.606124 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
event={"ID":"1c75a371-7547-47b8-ac59-d296d642cd5c","Type":"ContainerDied","Data":"f87c9010b5400087feb1291d3cbb97c6f8d1bbd0211c486ae28bfa3ea0d76f90"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.606137 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5jvq8" event={"ID":"0c7b96e7-c74b-4608-a81f-92a7d977c7d9","Type":"ContainerStarted","Data":"d04d99dd1cb2f318a7485389a8b73a0e2897fa320d227296e9cc98f380518d3e"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.613542 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m" event={"ID":"adf88882-bc11-4ee1-a2ba-cfd13d62b8dd","Type":"ContainerStarted","Data":"08d737fcf783d29e0a8b41d96f2e9f3753a75ba3a136f845e30f0518c7e186de"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.628291 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-m25b8" event={"ID":"6f5dca39-298b-4814-b77c-43dd0cbc4025","Type":"ContainerStarted","Data":"ccc4068b067deef256d6a5ce8a794000f45f09ce9a482e47ac7a58089d4a16b1"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.628957 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-m25b8" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.629945 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-m25b8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.630007 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m25b8" podUID="6f5dca39-298b-4814-b77c-43dd0cbc4025" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: 
connection refused" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.633302 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm" event={"ID":"5847be39-eb3d-4802-8f96-771f91078979","Type":"ContainerStarted","Data":"e5994a32ee348d58263f7ce21a6d5ed06124fb9a7973d2ad63387c68f30c0435"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.666158 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" event={"ID":"f5ae0ccc-a50b-46d1-b887-28840703ab87","Type":"ContainerStarted","Data":"0569095e221ac8df7163011e4326d3aaae4d919f9fae03227b3225cd20fe9fb3"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.666220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" event={"ID":"f5ae0ccc-a50b-46d1-b887-28840703ab87","Type":"ContainerStarted","Data":"a88ea6d9d4a920c00c669e2af30fd093dccd77faaa9d2544cd5be4fb86a8e11b"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.692104 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" event={"ID":"d9687355-25ac-44eb-ab53-25419b1c24b6","Type":"ContainerStarted","Data":"79c014000a287cdabb4a2355ae99401b0f0145a34616efd8aa8625b287fc0ece"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.692291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.693304 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:48.193288226 +0000 UTC m=+247.421909428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.714769 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz" event={"ID":"31c7ed96-b981-4f85-9f5a-dee62216ecd9","Type":"ContainerStarted","Data":"325fb68c1f03fe26d98ef5969e9b700172cd4d09c22c6a7f09822c1e4d9c8ae8"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.724225 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjh9n" podStartSLOduration=188.72420682 podStartE2EDuration="3m8.72420682s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:47.719609183 +0000 UTC m=+246.948230375" watchObservedRunningTime="2026-03-19 15:19:47.72420682 +0000 UTC m=+246.952828022" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.744834 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gvngw" 
event={"ID":"d43df1e2-591a-43e2-a7e2-f48459125711","Type":"ContainerStarted","Data":"77d212206ba4ff5f675797c53212adbb3582be891f837ed619b48772e09e7f09"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.751160 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwdj8" podStartSLOduration=187.751137422 podStartE2EDuration="3m7.751137422s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:47.747426858 +0000 UTC m=+246.976048060" watchObservedRunningTime="2026-03-19 15:19:47.751137422 +0000 UTC m=+246.979758634" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.776363 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpwl9"] Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.784588 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5jvq8" podStartSLOduration=187.7845697 podStartE2EDuration="3m7.7845697s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:47.781462512 +0000 UTC m=+247.010083714" watchObservedRunningTime="2026-03-19 15:19:47.7845697 +0000 UTC m=+247.013190902" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.793568 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 
19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.795507 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:48.295489527 +0000 UTC m=+247.524110729 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.799950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" event={"ID":"b4eb7061-dde4-44f1-943a-219d2f4f5071","Type":"ContainerStarted","Data":"6062d69dc80ba1a7481eed0919c864d812434d26b533306e215bf893c3aa329c"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.820388 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wbmz4" event={"ID":"47dd3e68-6697-43ba-b530-61447a1fe1e7","Type":"ContainerStarted","Data":"7fec7ec9a990f9ea59eb5749ca9a275fa6aebb0490ab7a078b9aab52ec6859dc"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.827209 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"] Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.846728 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" 
event={"ID":"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef","Type":"ContainerStarted","Data":"977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.865179 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg" event={"ID":"8796bcce-0185-4494-8485-66476bdde45c","Type":"ContainerStarted","Data":"0438e868de8ddf4f4a40788c3b30baa5c283ef1874ff6cd6a5514c1bd866c365"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.867885 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" event={"ID":"b56fbd59-432d-4672-bc27-0cb80aa81405","Type":"ContainerStarted","Data":"1bbaa0d11dcfc7d52b791747ba496d124ddc5284c33007fef74cc823c1971f44"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.869783 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" event={"ID":"2db36f46-e19a-4b7d-a94f-157f65671639","Type":"ContainerStarted","Data":"5eb5780eeaca6f8a841a5a4e07e72f7d687407a6a99c543e7f32501aea385d86"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.882078 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46676: no serving certificate available for the kubelet" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.882379 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.884861 4771 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-qj9qq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.31:6443/healthz\": dial tcp 10.217.0.31:6443: connect: connection refused" start-of-body= Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.884905 4771 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" podUID="2db36f46-e19a-4b7d-a94f-157f65671639" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.31:6443/healthz\": dial tcp 10.217.0.31:6443: connect: connection refused" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.892136 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x" event={"ID":"db1585ab-0970-4bd5-acba-7eda8ed2d40f","Type":"ContainerStarted","Data":"26dde5c03563da54f74a24600318980f2c700425c3ee1d82c1fbae69f5f6529f"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.896149 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:47 crc kubenswrapper[4771]: E0319 15:19:47.897090 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:48.397075692 +0000 UTC m=+247.625696894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.907114 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" event={"ID":"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf","Type":"ContainerStarted","Data":"33132cd0dc4d8c5e7fcfacd886193b40c5f7ef57eba05f61f35190297082a35e"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.919018 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr" event={"ID":"a212fe68-9219-479d-bf36-26b08daf31ab","Type":"ContainerStarted","Data":"1ca813beb55143a99cbd03493effc8e0c2283272f370148a3e14cac189afe0bc"} Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.927308 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.930161 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wwpj8" podStartSLOduration=188.930149251 podStartE2EDuration="3m8.930149251s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:47.895566204 +0000 UTC m=+247.124187416" watchObservedRunningTime="2026-03-19 15:19:47.930149251 +0000 UTC m=+247.158770453" Mar 19 15:19:47 crc kubenswrapper[4771]: I0319 15:19:47.979414 4771 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-sl682" podStartSLOduration=188.979394249 podStartE2EDuration="3m8.979394249s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:47.938662417 +0000 UTC m=+247.167283619" watchObservedRunningTime="2026-03-19 15:19:47.979394249 +0000 UTC m=+247.208015451" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.001177 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.001658 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:48.501646284 +0000 UTC m=+247.730267486 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.050720 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 15:19:48 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Mar 19 15:19:48 crc kubenswrapper[4771]: [+]process-running ok Mar 19 15:19:48 crc kubenswrapper[4771]: healthz check failed Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.050786 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.115555 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.118039 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 15:19:48.618016824 +0000 UTC m=+247.846638026 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.166299 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.203566 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-wbmz4" podStartSLOduration=6.203531732 podStartE2EDuration="6.203531732s" podCreationTimestamp="2026-03-19 15:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:48.139336924 +0000 UTC m=+247.367958126" watchObservedRunningTime="2026-03-19 15:19:48.203531732 +0000 UTC m=+247.432152934" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.204342 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x" podStartSLOduration=189.204336533 podStartE2EDuration="3m9.204336533s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:48.198976187 +0000 UTC m=+247.427597389" watchObservedRunningTime="2026-03-19 15:19:48.204336533 +0000 UTC m=+247.432957725" Mar 19 15:19:48 
crc kubenswrapper[4771]: I0319 15:19:48.218707 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.219469 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:48.719439896 +0000 UTC m=+247.948061088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.247974 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-m25b8" podStartSLOduration=188.247959559 podStartE2EDuration="3m8.247959559s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:48.24485007 +0000 UTC m=+247.473471272" watchObservedRunningTime="2026-03-19 15:19:48.247959559 +0000 UTC m=+247.476580761" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.248124 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.319047 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" podStartSLOduration=189.319026451 podStartE2EDuration="3m9.319026451s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:48.271412933 +0000 UTC m=+247.500034135" watchObservedRunningTime="2026-03-19 15:19:48.319026451 +0000 UTC m=+247.547647653" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.321266 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.321599 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:48.821584805 +0000 UTC m=+248.050205997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.327724 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" podStartSLOduration=188.32770706 podStartE2EDuration="3m8.32770706s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:48.312278899 +0000 UTC m=+247.540900101" watchObservedRunningTime="2026-03-19 15:19:48.32770706 +0000 UTC m=+247.556328262" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.331907 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.354924 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mx6f" podStartSLOduration=188.35490563 podStartE2EDuration="3m8.35490563s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:48.35252976 +0000 UTC m=+247.581150962" watchObservedRunningTime="2026-03-19 15:19:48.35490563 +0000 UTC m=+247.583526832" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.364623 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.373588 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.413569 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k96m2"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.423421 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.423946 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:48.92391882 +0000 UTC m=+248.152540022 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.515501 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ngxwv"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.530091 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.530491 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.030457611 +0000 UTC m=+248.259078813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.531126 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.531454 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.031446486 +0000 UTC m=+248.260067688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.564115 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.566713 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwlpq"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.590551 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46678: no serving certificate available for the kubelet" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.610183 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.632018 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wm59s"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.632409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.632857 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.132839207 +0000 UTC m=+248.361460409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.688945 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.734084 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.734523 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.234502634 +0000 UTC m=+248.463123836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.744581 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 15:19:48 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Mar 19 15:19:48 crc kubenswrapper[4771]: [+]process-running ok Mar 19 15:19:48 crc kubenswrapper[4771]: healthz check failed Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.744644 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.756170 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.767860 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bwgwn"] Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.844439 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wwpj8" Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.844512 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.844859 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.344844882 +0000 UTC m=+248.573466074 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.947444 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:48 crc kubenswrapper[4771]: E0319 15:19:48.948091 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.448077989 +0000 UTC m=+248.676699191 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:48 crc kubenswrapper[4771]: I0319 15:19:48.957148 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ngxwv" event={"ID":"d16bbb1e-ba55-4380-a503-e67f57dae69a","Type":"ContainerStarted","Data":"0113df3785e4e73d52cc42f153a7ba8d2c5c9fcb8a165864400b8eff1d1de34f"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.017967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wbmz4" event={"ID":"47dd3e68-6697-43ba-b530-61447a1fe1e7","Type":"ContainerStarted","Data":"9df6ac6b01f09378089c39d73e1cecf33e475659e9b5648e4c2069d6f401e955"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.046274 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" event={"ID":"f790bc61-0a88-45de-b672-76b356fb8522","Type":"ContainerStarted","Data":"68bed1762d7c9453cd542a4cfa415e8d92da7ead31ba04e9a153b85a4f64e2c9"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.047307 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.048213 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.048592 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.548575557 +0000 UTC m=+248.777196759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.053855 4771 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lxmpv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.054361 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" podUID="f790bc61-0a88-45de-b672-76b356fb8522" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.075878 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" 
event={"ID":"1c75a371-7547-47b8-ac59-d296d642cd5c","Type":"ContainerStarted","Data":"8d20dedd411b462d60001bec9b59c8beb4c3e6e0ec6e5fd78ee722029b698a16"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.075956 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" event={"ID":"1c75a371-7547-47b8-ac59-d296d642cd5c","Type":"ContainerStarted","Data":"a407eeaad6ca3fa7451d0386fac8bf77196e7daf4b1e7dbf5c1e8315f0b19204"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.087622 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" podStartSLOduration=189.087593347 podStartE2EDuration="3m9.087593347s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.082383364 +0000 UTC m=+248.311004566" watchObservedRunningTime="2026-03-19 15:19:49.087593347 +0000 UTC m=+248.316214549" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.101459 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" event={"ID":"b4eb7061-dde4-44f1-943a-219d2f4f5071","Type":"ContainerStarted","Data":"e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.102253 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.109194 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7fqvf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 19 15:19:49 crc 
kubenswrapper[4771]: I0319 15:19:49.109258 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.127220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" event={"ID":"d9687355-25ac-44eb-ab53-25419b1c24b6","Type":"ContainerStarted","Data":"6bf8f4b073a9fcfb0c643f531bf7536623a8f7ab3d80e715b4e44f81cd44f0d9"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.155799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.158583 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.658562715 +0000 UTC m=+248.887183917 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.160037 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" event={"ID":"a0231568-cb75-4c4a-be45-e07f0a03c320","Type":"ContainerStarted","Data":"60a1eea10253ffb51a5ba31b39084cc35e8f9c046cf4db9ae1e1bdf6633e1fae"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.162067 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" podStartSLOduration=190.162051254 podStartE2EDuration="3m10.162051254s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.159950271 +0000 UTC m=+248.388571473" watchObservedRunningTime="2026-03-19 15:19:49.162051254 +0000 UTC m=+248.390672456" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.211235 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" event={"ID":"f3dd2372-7717-4dd5-9812-60a628b02dda","Type":"ContainerStarted","Data":"62cef2eb0af2798da146c714407ad873b9ed06b817dbcfdce81ea86b8922f7e8"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.214079 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zpxp6" podStartSLOduration=189.214066603 
podStartE2EDuration="3m9.214066603s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.212204576 +0000 UTC m=+248.440825778" watchObservedRunningTime="2026-03-19 15:19:49.214066603 +0000 UTC m=+248.442687805" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.214863 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" event={"ID":"7c846868-7bc7-4aca-b36c-b8e85cc31ac2","Type":"ContainerStarted","Data":"a2bf69f4bcf4420f8f65fcac0be1c27067e648779544e8f8a7455249ca08b003"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.218823 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8827r" event={"ID":"bfd551a7-5bcf-4bfd-a881-0c60d5f8afdf","Type":"ContainerStarted","Data":"a67070a9c11cf33ce99d47a0efd02c2272136079f142188e13fbf0af06d56983"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.272722 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.272919 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.772894915 +0000 UTC m=+249.001516107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.274284 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.276682 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.776674001 +0000 UTC m=+249.005295203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.281352 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr" event={"ID":"a212fe68-9219-479d-bf36-26b08daf31ab","Type":"ContainerStarted","Data":"1b5f47f8513efe0d319a8f824cd69f9f454fc8b9e0bdad5c050357e71a8d8239"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.298800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fs46x" event={"ID":"db1585ab-0970-4bd5-acba-7eda8ed2d40f","Type":"ContainerStarted","Data":"d0e3472f1d1677f39505f24234cda4b51f72e8023f843ddc8e48a91089d98ffd"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.317332 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" podStartSLOduration=189.317312431 podStartE2EDuration="3m9.317312431s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.268908483 +0000 UTC m=+248.497529685" watchObservedRunningTime="2026-03-19 15:19:49.317312431 +0000 UTC m=+248.545933633" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.319226 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" podStartSLOduration=190.319218239 
podStartE2EDuration="3m10.319218239s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.316306185 +0000 UTC m=+248.544927387" watchObservedRunningTime="2026-03-19 15:19:49.319218239 +0000 UTC m=+248.547839431" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.326494 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" event={"ID":"0f6015a9-593b-4a7f-a16b-dc415bba5374","Type":"ContainerStarted","Data":"0211b174a27ff36eb0832ea4b823115bba438e0cfe61df2f8989c864ecdf5ebf"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.326547 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" event={"ID":"0f6015a9-593b-4a7f-a16b-dc415bba5374","Type":"ContainerStarted","Data":"a68d68827c903af6db4ffa655e2ea2f281ebecb1780d380822b32ec7f9a5350f"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.375702 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.377813 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.877789254 +0000 UTC m=+249.106410526 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.381007 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vdfcr" podStartSLOduration=189.380976255 podStartE2EDuration="3m9.380976255s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.378324927 +0000 UTC m=+248.606946129" watchObservedRunningTime="2026-03-19 15:19:49.380976255 +0000 UTC m=+248.609597457" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.397121 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" event={"ID":"b56fbd59-432d-4672-bc27-0cb80aa81405","Type":"ContainerStarted","Data":"7d69f9dd0c7e966bdb6749084743c4dd0e922bba0c9b6d840a624e59f116bf69"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.397183 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" event={"ID":"b56fbd59-432d-4672-bc27-0cb80aa81405","Type":"ContainerStarted","Data":"f8100880260de1958f2026962a22d2cbcf62ea1606575b1ed3dd179e43b7dfa9"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.419887 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" 
event={"ID":"53b661a1-d7da-45eb-9861-dca1509920ee","Type":"ContainerStarted","Data":"5abf88ac381b9ab41a15707c284dfa756aa1aafd36e7c91078c9d4fd6a63f91f"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.456340 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" event={"ID":"6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1","Type":"ContainerStarted","Data":"6ea37e84c7862ae26483387ef8d79b71661975bca0f2712f70dfb1e666ce0f87"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.480996 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wrsj5" podStartSLOduration=189.48096413 podStartE2EDuration="3m9.48096413s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.420030875 +0000 UTC m=+248.648652077" watchObservedRunningTime="2026-03-19 15:19:49.48096413 +0000 UTC m=+248.709585332" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.481847 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.482286 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qvlnd" podStartSLOduration=189.482278823 podStartE2EDuration="3m9.482278823s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 15:19:49.481309229 +0000 UTC m=+248.709930431" watchObservedRunningTime="2026-03-19 15:19:49.482278823 +0000 UTC m=+248.710900025" Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.482920 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:49.982908039 +0000 UTC m=+249.211529241 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.573966 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m" event={"ID":"adf88882-bc11-4ee1-a2ba-cfd13d62b8dd","Type":"ContainerStarted","Data":"e5628e4f29ef48a434d46ee08ed8801e2c0c1c5672d54941eca1c95cefe733a2"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.584538 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.584994 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-19 15:19:50.084964177 +0000 UTC m=+249.313585379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.598130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" event={"ID":"6c6abe54-14f8-429a-9833-9492e274ec41","Type":"ContainerStarted","Data":"4d01f33c95abbf35f33eb6e64d06dff74e30d225153ff7c609519a7727c98e2c"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.598195 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" event={"ID":"6c6abe54-14f8-429a-9833-9492e274ec41","Type":"ContainerStarted","Data":"a188dcb4463500a94442027b18d326b10916a43b563dadb9547dc3e13a1bfd18"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.613664 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" event={"ID":"98e7c37e-276a-4fae-aa3a-856f6c33e608","Type":"ContainerStarted","Data":"2583c114e031a85c3255d3b59622a2257b706d38862c08b1301e5b3467504218"} Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.628319 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2f87m" podStartSLOduration=189.628301766 podStartE2EDuration="3m9.628301766s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.562491807 +0000 UTC m=+248.791113009" watchObservedRunningTime="2026-03-19 15:19:49.628301766 +0000 UTC m=+248.856922968"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.639575 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz" event={"ID":"31c7ed96-b981-4f85-9f5a-dee62216ecd9","Type":"ContainerStarted","Data":"a21385ea0715f50075b7e55d079a7dbdde10539fabdad043fc1a9a61ff5685ae"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.646834 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwlpq" event={"ID":"398ecde0-3269-4a96-a831-9762c0fd76b0","Type":"ContainerStarted","Data":"e241690ab874ca40e53b74c9bb2efb823fd523f3396b29b18e4f3ac2c00914c9"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.666816 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vm7fz" podStartSLOduration=189.666797801 podStartE2EDuration="3m9.666797801s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.666412062 +0000 UTC m=+248.895033264" watchObservedRunningTime="2026-03-19 15:19:49.666797801 +0000 UTC m=+248.895419003"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.687456 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tzhdc" podStartSLOduration=189.687430595 podStartE2EDuration="3m9.687430595s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.644770224 +0000 UTC m=+248.873391426" watchObservedRunningTime="2026-03-19
15:19:49.687430595 +0000 UTC m=+248.916051797"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.696462 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.696895 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:50.196880724 +0000 UTC m=+249.425501926 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.722343 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm" event={"ID":"5847be39-eb3d-4802-8f96-771f91078979","Type":"ContainerStarted","Data":"319a5b47c9a02f61bb753740a65f4d15549874ac5da2c3d8e7b951a06feaad96"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.743191 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 15:19:49 crc
kubenswrapper[4771]: [-]has-synced failed: reason withheld
Mar 19 15:19:49 crc kubenswrapper[4771]: [+]process-running ok
Mar 19 15:19:49 crc kubenswrapper[4771]: healthz check failed
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.743256 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.771694 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" event={"ID":"b11452b4-794b-41a4-a700-0b541916c6ad","Type":"ContainerStarted","Data":"211c27760893e40ce88070ae99c8c0baa6d15c69dae660f71e060134092dd39c"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.796927 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.797939 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:50.297923106 +0000 UTC m=+249.526544308 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.798898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg" event={"ID":"8796bcce-0185-4494-8485-66476bdde45c","Type":"ContainerStarted","Data":"600692f427f63464eeaf274dfe97829db23f44bdbb49d367ebc3a797bab53dc4"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.798934 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg" event={"ID":"8796bcce-0185-4494-8485-66476bdde45c","Type":"ContainerStarted","Data":"15c7d7fbfcca47a446bf74bad74174e03b880ce06b56f30f6b5e9e07755ce22c"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.799663 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fk4xm" podStartSLOduration=189.79963894 podStartE2EDuration="3m9.79963894s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.759200834 +0000 UTC m=+248.987822036" watchObservedRunningTime="2026-03-19 15:19:49.79963894 +0000 UTC m=+249.028260142"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.810116 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" podStartSLOduration=189.810097425 podStartE2EDuration="3m9.810097425s"
podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.799510356 +0000 UTC m=+249.028131558" watchObservedRunningTime="2026-03-19 15:19:49.810097425 +0000 UTC m=+249.038718627"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.824733 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" event={"ID":"8d55d3e9-4387-4456-814b-34317b8768f5","Type":"ContainerStarted","Data":"9d53503b719f97522baba1f6899d9f8c4a98f064fe71d1f24d10c1b8f0ef853a"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.858823 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" event={"ID":"259b7e23-88c9-452d-a549-a0ccbfacbcb5","Type":"ContainerStarted","Data":"e4e4d61e050a377a46180f1ade4ab1764df1209a0ee4fcc5cf41688dedbad209"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.881105 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" event={"ID":"84955373-f1a6-473c-8b85-2f8d4dc29256","Type":"ContainerStarted","Data":"c796171f77713fff534cb23a3daa8ae042f4eb19f946483564183f2069d0c687"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.882060 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.882254 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.882279 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19
15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.886156 4771 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-q7b76 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body=
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.886200 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" podUID="84955373-f1a6-473c-8b85-2f8d4dc29256" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.896126 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" event={"ID":"40ac0b95-eb22-4b20-8870-b440d5fa12d1","Type":"ContainerStarted","Data":"eae6d9126242571d1484ba5ab45e2f227ac78690c88f539a55eb49ee14d27b2f"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.900325 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" podStartSLOduration=189.900310212 podStartE2EDuration="3m9.900310212s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.8994479 +0000 UTC m=+249.128069102" watchObservedRunningTime="2026-03-19 15:19:49.900310212 +0000 UTC m=+249.128931414"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.900539 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-fvhqg" podStartSLOduration=189.900535158
podStartE2EDuration="3m9.900535158s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.844774144 +0000 UTC m=+249.073395346" watchObservedRunningTime="2026-03-19 15:19:49.900535158 +0000 UTC m=+249.129156360"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.904492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:49 crc kubenswrapper[4771]: E0319 15:19:49.906907 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:50.406894299 +0000 UTC m=+249.635515501 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.920311 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" event={"ID":"c09ae4bc-4073-442f-8cbc-0f42ea00a35d","Type":"ContainerStarted","Data":"3e8eb615cde425bf67e38c945a70404869394ea4ccd828a12f236c2ca093a274"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.920561 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" event={"ID":"c09ae4bc-4073-442f-8cbc-0f42ea00a35d","Type":"ContainerStarted","Data":"839599cc423bc431b94ced8d733e54cfc26dddf56b4bff939bc1897e8fa6dd88"}
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.920783 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" podUID="93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" containerName="route-controller-manager" containerID="cri-o://977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199" gracePeriod=30
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.920867 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-m25b8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.920910 4771 prober.go:107] "Probe failed" probeType="Readiness"
pod="openshift-console/downloads-7954f5f757-m25b8" podUID="6f5dca39-298b-4814-b77c-43dd0cbc4025" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.921015 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" podUID="b01a56e9-ee30-4945-b582-5ff927104c4c" containerName="controller-manager" containerID="cri-o://663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e" gracePeriod=30
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.960372 4771 ???:1] "http: TLS handshake error from 192.168.126.11:48572: no serving certificate available for the kubelet"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.960355 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" podStartSLOduration=189.960337654 podStartE2EDuration="3m9.960337654s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:49.95073407 +0000 UTC m=+249.179355262" watchObservedRunningTime="2026-03-19 15:19:49.960337654 +0000 UTC m=+249.188958856"
Mar 19 15:19:49 crc kubenswrapper[4771]: I0319 15:19:49.976061 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.005664 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID:
\"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.007701 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:50.507680255 +0000 UTC m=+249.736301457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.097512 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9sf6z" podStartSLOduration=190.09742423 podStartE2EDuration="3m10.09742423s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:50.021261068 +0000 UTC m=+249.249882270" watchObservedRunningTime="2026-03-19 15:19:50.09742423 +0000 UTC m=+249.326045432"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.110671 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.111066
4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:50.611054746 +0000 UTC m=+249.839675948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.121371 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.122119 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.217161 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.217511 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:50.717495074 +0000 UTC m=+249.946116276 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.319190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.319586 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:50.819572791 +0000 UTC m=+250.048193993 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.334940 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.419955 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.420332 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:50.920315236 +0000 UTC m=+250.148936438 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.526103 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.526506 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:51.026486558 +0000 UTC m=+250.255107760 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.636131 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.637021 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:51.13700044 +0000 UTC m=+250.365621642 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.737677 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.738026 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:51.238013911 +0000 UTC m=+250.466635113 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.762661 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 15:19:50 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld
Mar 19 15:19:50 crc kubenswrapper[4771]: [+]process-running ok
Mar 19 15:19:50 crc kubenswrapper[4771]: healthz check failed
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.762719 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.769298 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.798895 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8"]
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.799139 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01a56e9-ee30-4945-b582-5ff927104c4c" containerName="controller-manager"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.799151 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01a56e9-ee30-4945-b582-5ff927104c4c" containerName="controller-manager"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.799259 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01a56e9-ee30-4945-b582-5ff927104c4c" containerName="controller-manager"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.799611 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.831697 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8"]
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.843529 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.844184 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed.
No retries permitted until 2026-03-19 15:19:51.344168582 +0000 UTC m=+250.572789784 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.869781 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.946656 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-client-ca\") pod \"b01a56e9-ee30-4945-b582-5ff927104c4c\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") "
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947058 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-config\") pod \"b01a56e9-ee30-4945-b582-5ff927104c4c\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") "
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947114 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b01a56e9-ee30-4945-b582-5ff927104c4c-serving-cert\") pod \"b01a56e9-ee30-4945-b582-5ff927104c4c\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") "
Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947154 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName:
\"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-proxy-ca-bundles\") pod \"b01a56e9-ee30-4945-b582-5ff927104c4c\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947189 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzzgx\" (UniqueName: \"kubernetes.io/projected/b01a56e9-ee30-4945-b582-5ff927104c4c-kube-api-access-lzzgx\") pod \"b01a56e9-ee30-4945-b582-5ff927104c4c\" (UID: \"b01a56e9-ee30-4945-b582-5ff927104c4c\") " Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947417 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-client-ca\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947442 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-proxy-ca-bundles\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947506 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-config\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947524 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xskff\" (UniqueName: \"kubernetes.io/projected/e51098c3-dbac-473d-ab5e-a3adf6018c28-kube-api-access-xskff\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.947550 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51098c3-dbac-473d-ab5e-a3adf6018c28-serving-cert\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.948378 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-client-ca" (OuterVolumeSpecName: "client-ca") pod "b01a56e9-ee30-4945-b582-5ff927104c4c" (UID: "b01a56e9-ee30-4945-b582-5ff927104c4c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.948903 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-config" (OuterVolumeSpecName: "config") pod "b01a56e9-ee30-4945-b582-5ff927104c4c" (UID: "b01a56e9-ee30-4945-b582-5ff927104c4c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.950548 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b01a56e9-ee30-4945-b582-5ff927104c4c" (UID: "b01a56e9-ee30-4945-b582-5ff927104c4c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:19:50 crc kubenswrapper[4771]: E0319 15:19:50.951037 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:51.451021581 +0000 UTC m=+250.679642773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.957609 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01a56e9-ee30-4945-b582-5ff927104c4c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b01a56e9-ee30-4945-b582-5ff927104c4c" (UID: "b01a56e9-ee30-4945-b582-5ff927104c4c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.965737 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" event={"ID":"98e7c37e-276a-4fae-aa3a-856f6c33e608","Type":"ContainerStarted","Data":"f32518f925b3d9cfa711ab22d89e89b3fb1d7e4e902954971df4379cfb15b900"} Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.965799 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" event={"ID":"98e7c37e-276a-4fae-aa3a-856f6c33e608","Type":"ContainerStarted","Data":"37bf39908f5f37d2acaebc59e5bf7f9dec33db23c6454757082514409dff33e0"} Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.966675 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01a56e9-ee30-4945-b582-5ff927104c4c-kube-api-access-lzzgx" (OuterVolumeSpecName: "kube-api-access-lzzgx") pod "b01a56e9-ee30-4945-b582-5ff927104c4c" (UID: "b01a56e9-ee30-4945-b582-5ff927104c4c"). InnerVolumeSpecName "kube-api-access-lzzgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.991509 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwlpq" event={"ID":"398ecde0-3269-4a96-a831-9762c0fd76b0","Type":"ContainerStarted","Data":"b40393c297cb27948f3a7d6b88dc27a458116b334fd8b03b72ca2247f1cdc77e"} Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.991569 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwlpq" event={"ID":"398ecde0-3269-4a96-a831-9762c0fd76b0","Type":"ContainerStarted","Data":"7619e26a3bccfb139718cb15c2aaa2f3a030568b70ba2d8038345ec4f398bbe1"} Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.991691 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hwlpq" Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.998245 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" event={"ID":"8d55d3e9-4387-4456-814b-34317b8768f5","Type":"ContainerStarted","Data":"46f927c9f638087e4d0ee7f4726f43d10494df663c93e86b5d209d880cbcc8eb"} Mar 19 15:19:50 crc kubenswrapper[4771]: I0319 15:19:50.998301 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" event={"ID":"8d55d3e9-4387-4456-814b-34317b8768f5","Type":"ContainerStarted","Data":"b5dc259beb702dcf03c69d6832abc9f7f2d88ed84e1bc8fefa5118ec20bd1e0e"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.009146 4771 generic.go:334] "Generic (PLEG): container finished" podID="b01a56e9-ee30-4945-b582-5ff927104c4c" containerID="663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e" exitCode=0 Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.009305 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" 
event={"ID":"b01a56e9-ee30-4945-b582-5ff927104c4c","Type":"ContainerDied","Data":"663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.009345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" event={"ID":"b01a56e9-ee30-4945-b582-5ff927104c4c","Type":"ContainerDied","Data":"8f2f0eaf83d2fe1eb66fb2f852ce2a9273643fe22ab57c87f1e36115962174db"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.009373 4771 scope.go:117] "RemoveContainer" containerID="663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.009568 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jpwl9" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.009526 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-k96m2" podStartSLOduration=191.009497794 podStartE2EDuration="3m11.009497794s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:51.003216575 +0000 UTC m=+250.231837777" watchObservedRunningTime="2026-03-19 15:19:51.009497794 +0000 UTC m=+250.238118996" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.043652 4771 generic.go:334] "Generic (PLEG): container finished" podID="53b661a1-d7da-45eb-9861-dca1509920ee" containerID="c928aba78c71a0f7e4d5289547c93575af1113239ca8e5959094d21bc72e3173" exitCode=0 Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.043799 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" 
event={"ID":"53b661a1-d7da-45eb-9861-dca1509920ee","Type":"ContainerDied","Data":"c928aba78c71a0f7e4d5289547c93575af1113239ca8e5959094d21bc72e3173"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.048354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfkm6\" (UniqueName: \"kubernetes.io/projected/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-kube-api-access-hfkm6\") pod \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.048422 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-serving-cert\") pod \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.048566 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.048667 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-client-ca\") pod \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.048709 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-config\") pod \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\" (UID: \"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef\") " Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 
15:19:51.048970 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-config\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.049022 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xskff\" (UniqueName: \"kubernetes.io/projected/e51098c3-dbac-473d-ab5e-a3adf6018c28-kube-api-access-xskff\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.049059 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51098c3-dbac-473d-ab5e-a3adf6018c28-serving-cert\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.050006 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-client-ca" (OuterVolumeSpecName: "client-ca") pod "93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" (UID: "93121b3f-c7da-4e21-8fa0-7f8bf08a05ef"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.052796 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-kube-api-access-hfkm6" (OuterVolumeSpecName: "kube-api-access-hfkm6") pod "93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" (UID: "93121b3f-c7da-4e21-8fa0-7f8bf08a05ef"). 
InnerVolumeSpecName "kube-api-access-hfkm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.059509 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" (UID: "93121b3f-c7da-4e21-8fa0-7f8bf08a05ef"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.059652 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:51.559631395 +0000 UTC m=+250.788252597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.049129 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-client-ca\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.059734 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-proxy-ca-bundles\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.060179 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzzgx\" (UniqueName: \"kubernetes.io/projected/b01a56e9-ee30-4945-b582-5ff927104c4c-kube-api-access-lzzgx\") on node \"crc\" DevicePath \"\"" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.060222 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfkm6\" (UniqueName: \"kubernetes.io/projected/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-kube-api-access-hfkm6\") on node \"crc\" DevicePath \"\"" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.060235 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.060254 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.060275 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.060286 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b01a56e9-ee30-4945-b582-5ff927104c4c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.060295 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b01a56e9-ee30-4945-b582-5ff927104c4c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.060307 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.060411 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-config" (OuterVolumeSpecName: "config") pod "93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" (UID: "93121b3f-c7da-4e21-8fa0-7f8bf08a05ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.061936 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2vr9c" podStartSLOduration=191.061922974 podStartE2EDuration="3m11.061922974s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:51.026736522 +0000 UTC m=+250.255357724" watchObservedRunningTime="2026-03-19 15:19:51.061922974 +0000 UTC m=+250.290544176" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.063904 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-client-ca\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.064138 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hwlpq" 
podStartSLOduration=8.064128899 podStartE2EDuration="8.064128899s" podCreationTimestamp="2026-03-19 15:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:51.061874783 +0000 UTC m=+250.290495985" watchObservedRunningTime="2026-03-19 15:19:51.064128899 +0000 UTC m=+250.292750101" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.068155 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-proxy-ca-bundles\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.068763 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-config\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.086325 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51098c3-dbac-473d-ab5e-a3adf6018c28-serving-cert\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.086486 4771 scope.go:117] "RemoveContainer" containerID="663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e" Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.087844 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e\": container with ID starting with 663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e not found: ID does not exist" containerID="663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.087868 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e"} err="failed to get container status \"663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e\": rpc error: code = NotFound desc = could not find container \"663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e\": container with ID starting with 663ab16876d65e71b90cf6f42359db193a0d1ec312fa275d8ef91366cea8078e not found: ID does not exist" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.100266 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" event={"ID":"6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1","Type":"ContainerStarted","Data":"044ae16b0828b4230287a05d4a85a24e2de3ed4eb96221a65c1954634b282112"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.100315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" event={"ID":"6e1420e8-ae3b-4ef3-98e3-f2b63eb032e1","Type":"ContainerStarted","Data":"800e0fd5d13f839d87473a678dff5e9fe5a521e08ed8e67979769693e6d62159"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.100506 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.102672 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xskff\" (UniqueName: 
\"kubernetes.io/projected/e51098c3-dbac-473d-ab5e-a3adf6018c28-kube-api-access-xskff\") pod \"controller-manager-5f456bfcb7-sx5j8\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.121361 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpwl9"] Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.123631 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jpwl9"] Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.127284 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" event={"ID":"84955373-f1a6-473c-8b85-2f8d4dc29256","Type":"ContainerStarted","Data":"a694b5e92829f1bc1f73865509b59b16091f32c01f33bffb564260de03465943"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.149673 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" podStartSLOduration=191.149654827 podStartE2EDuration="3m11.149654827s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:51.146410865 +0000 UTC m=+250.375032067" watchObservedRunningTime="2026-03-19 15:19:51.149654827 +0000 UTC m=+250.378276019" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.155029 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" event={"ID":"7c846868-7bc7-4aca-b36c-b8e85cc31ac2","Type":"ContainerStarted","Data":"6d27cff11ca6aa5cb01859ebf55278cc2a592117f476c9260f648dd1cda3c0ef"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.162512 4771 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.163342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.164200 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.164468 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:51.664450213 +0000 UTC m=+250.893071415 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.175413 4771 generic.go:334] "Generic (PLEG): container finished" podID="93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" containerID="977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199" exitCode=0 Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.175490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" event={"ID":"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef","Type":"ContainerDied","Data":"977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.175522 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" event={"ID":"93121b3f-c7da-4e21-8fa0-7f8bf08a05ef","Type":"ContainerDied","Data":"c8d8850098f2ad1d5d38dff8043e6437ffd39fb019ea824056a52baba62ecd4f"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.175541 4771 scope.go:117] "RemoveContainer" containerID="977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.175651 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.198231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" event={"ID":"40ac0b95-eb22-4b20-8870-b440d5fa12d1","Type":"ContainerStarted","Data":"976cc19ce6f3911b16143c99716e1f14ffca269d3edf0effe844cfa6915e050b"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.213316 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"] Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.214279 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hmvx4"] Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.222636 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" event={"ID":"f3dd2372-7717-4dd5-9812-60a628b02dda","Type":"ContainerStarted","Data":"79c4ad342cd7add237d6a2b34e442e3d794ef58b411b7236a977d322a1cb8cb4"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.223342 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.235088 4771 scope.go:117] "RemoveContainer" containerID="977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199" Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.237377 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199\": container with ID starting with 977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199 not found: ID does not 
exist" containerID="977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.237404 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199"} err="failed to get container status \"977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199\": rpc error: code = NotFound desc = could not find container \"977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199\": container with ID starting with 977e9b50246eb18a00dfeb032ae7a306e14e8b9f8e23b71bb3d8cb312eb9f199 not found: ID does not exist" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.240767 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ngxwv" event={"ID":"d16bbb1e-ba55-4380-a503-e67f57dae69a","Type":"ContainerStarted","Data":"26f649c81f200d0501402d9a0a22481ba1af3c1420f0efab103a7e21dd84915d"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.242998 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l5rcq" event={"ID":"259b7e23-88c9-452d-a549-a0ccbfacbcb5","Type":"ContainerStarted","Data":"7dee5f07d6c0c1df7824a75a7d14ad74fa688cca67b8893349f0a0b2d2c622ca"} Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.247312 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7fqvf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.247370 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.251851 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.253783 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lxmpv" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.254750 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qkhp" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.266838 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.267220 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:51.767204538 +0000 UTC m=+250.995825740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.310616 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lm2zv" podStartSLOduration=191.310593288 podStartE2EDuration="3m11.310593288s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:51.271485316 +0000 UTC m=+250.500106518" watchObservedRunningTime="2026-03-19 15:19:51.310593288 +0000 UTC m=+250.539214490" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.312034 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-54ctz" podStartSLOduration=191.312026864 podStartE2EDuration="3m11.312026864s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:51.310793993 +0000 UTC m=+250.539415195" watchObservedRunningTime="2026-03-19 15:19:51.312026864 +0000 UTC m=+250.540648066" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.368943 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.381694 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:51.8816764 +0000 UTC m=+251.110297602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.471937 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.472818 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:51.972801611 +0000 UTC m=+251.201422813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.490956 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ngxwv" podStartSLOduration=8.490937141 podStartE2EDuration="8.490937141s" podCreationTimestamp="2026-03-19 15:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:51.488022387 +0000 UTC m=+250.716643589" watchObservedRunningTime="2026-03-19 15:19:51.490937141 +0000 UTC m=+250.719558343" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.558898 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" path="/var/lib/kubelet/pods/93121b3f-c7da-4e21-8fa0-7f8bf08a05ef/volumes" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.559551 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01a56e9-ee30-4945-b582-5ff927104c4c" path="/var/lib/kubelet/pods/b01a56e9-ee30-4945-b582-5ff927104c4c/volumes" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.573842 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:51 crc 
kubenswrapper[4771]: E0319 15:19:51.574204 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.074192032 +0000 UTC m=+251.302813234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.675226 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.675590 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.175574482 +0000 UTC m=+251.404195684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.745810 4771 patch_prober.go:28] interesting pod/apiserver-76f77b778f-rn8sc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]log ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]etcd ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]poststarthook/max-in-flight-filter ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 19 15:19:51 crc kubenswrapper[4771]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 19 15:19:51 crc kubenswrapper[4771]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 19 15:19:51 crc kubenswrapper[4771]: [+]poststarthook/project.openshift.io-projectcache ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-startinformers ok Mar 19 15:19:51 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 19 15:19:51 crc 
kubenswrapper[4771]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 15:19:51 crc kubenswrapper[4771]: livez check failed Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.745943 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" podUID="1c75a371-7547-47b8-ac59-d296d642cd5c" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.755967 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 15:19:51 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Mar 19 15:19:51 crc kubenswrapper[4771]: [+]process-running ok Mar 19 15:19:51 crc kubenswrapper[4771]: healthz check failed Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.770157 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.777555 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.778510 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 15:19:52.278495401 +0000 UTC m=+251.507116603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.810498 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8"] Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.880845 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.881080 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.381048682 +0000 UTC m=+251.609669884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.882121 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.882645 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.382622592 +0000 UTC m=+251.611243794 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.957188 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b76" Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.983317 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.983572 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.48354457 +0000 UTC m=+251.712165772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:51 crc kubenswrapper[4771]: I0319 15:19:51.983684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:51 crc kubenswrapper[4771]: E0319 15:19:51.984291 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.484283898 +0000 UTC m=+251.712905090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.085299 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.085534 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.585475325 +0000 UTC m=+251.814096527 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.085707 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.086080 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.586066549 +0000 UTC m=+251.814687741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.187080 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.187285 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.687258125 +0000 UTC m=+251.915879327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.187452 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.187895 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.687888761 +0000 UTC m=+251.916509963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.257652 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" event={"ID":"a0231568-cb75-4c4a-be45-e07f0a03c320","Type":"ContainerStarted","Data":"8fe2816048c1fc95013b994993ea0f8bb20e090ec26003feeaa3100ef47727a9"} Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.261278 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" event={"ID":"e51098c3-dbac-473d-ab5e-a3adf6018c28","Type":"ContainerStarted","Data":"8359e5ead9c7aecb1d484b35f501e32a86a8eef5b356325ae85c0458ce55d4a2"} Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.261361 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" event={"ID":"e51098c3-dbac-473d-ab5e-a3adf6018c28","Type":"ContainerStarted","Data":"82b18d4d68cd50b57472ab1b54e9d5857c39e22aabd0814f7a36aa8c7829ae69"} Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.261380 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.274082 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.278616 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" event={"ID":"53b661a1-d7da-45eb-9861-dca1509920ee","Type":"ContainerStarted","Data":"9aa7c0264f876e1e6d312ed3f80d861c476133f61bde73a49dd67c06af983919"} Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.286043 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.290465 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.290851 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.790832391 +0000 UTC m=+252.019453593 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.294973 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" podStartSLOduration=4.294959556 podStartE2EDuration="4.294959556s" podCreationTimestamp="2026-03-19 15:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:52.294414042 +0000 UTC m=+251.523035244" watchObservedRunningTime="2026-03-19 15:19:52.294959556 +0000 UTC m=+251.523580758" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.309789 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.326678 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s" podStartSLOduration=193.32666071 podStartE2EDuration="3m13.32666071s" podCreationTimestamp="2026-03-19 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:52.324597348 +0000 UTC m=+251.553218550" watchObservedRunningTime="2026-03-19 15:19:52.32666071 +0000 UTC m=+251.555281912" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.392251 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.395596 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.895579527 +0000 UTC m=+252.124200729 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.493291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.493489 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.993463158 +0000 UTC m=+252.222084360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.493637 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.494024 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:52.994010962 +0000 UTC m=+252.222632164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.555280 4771 ???:1] "http: TLS handshake error from 192.168.126.11:48586: no serving certificate available for the kubelet" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.594777 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.594980 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.094938492 +0000 UTC m=+252.323559694 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.595473 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.595952 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.095937376 +0000 UTC m=+252.324558578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.676124 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rlzb8"] Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.676361 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" containerName="route-controller-manager" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.676741 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" containerName="route-controller-manager" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.676895 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="93121b3f-c7da-4e21-8fa0-7f8bf08a05ef" containerName="route-controller-manager" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.677692 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.682236 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.693898 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlzb8"] Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.697849 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.698112 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.198086936 +0000 UTC m=+252.426708138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.699507 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.700048 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.200038076 +0000 UTC m=+252.428659278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.744231 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 15:19:52 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Mar 19 15:19:52 crc kubenswrapper[4771]: [+]process-running ok Mar 19 15:19:52 crc kubenswrapper[4771]: healthz check failed Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.744320 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.801283 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.801490 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 15:19:53.301462848 +0000 UTC m=+252.530084050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.801544 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-utilities\") pod \"certified-operators-rlzb8\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.801618 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mmh\" (UniqueName: \"kubernetes.io/projected/b49408ed-5087-4cb2-b70e-391c32aad069-kube-api-access-x9mmh\") pod \"certified-operators-rlzb8\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.801644 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-catalog-content\") pod \"certified-operators-rlzb8\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.801666 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.802016 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.302003801 +0000 UTC m=+252.530625003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.876348 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4fmgq"] Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.877590 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.879527 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.892190 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4fmgq"] Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.904679 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.904907 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.40487444 +0000 UTC m=+252.633495642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.904947 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mmh\" (UniqueName: \"kubernetes.io/projected/b49408ed-5087-4cb2-b70e-391c32aad069-kube-api-access-x9mmh\") pod \"certified-operators-rlzb8\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.905064 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.905090 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-catalog-content\") pod \"certified-operators-rlzb8\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.905257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-utilities\") pod \"certified-operators-rlzb8\" (UID: 
\"b49408ed-5087-4cb2-b70e-391c32aad069\") " pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: E0319 15:19:52.905392 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.405378012 +0000 UTC m=+252.633999214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.905724 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-utilities\") pod \"certified-operators-rlzb8\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.905915 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-catalog-content\") pod \"certified-operators-rlzb8\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.941299 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mmh\" (UniqueName: \"kubernetes.io/projected/b49408ed-5087-4cb2-b70e-391c32aad069-kube-api-access-x9mmh\") pod \"certified-operators-rlzb8\" (UID: 
\"b49408ed-5087-4cb2-b70e-391c32aad069\") " pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:52 crc kubenswrapper[4771]: I0319 15:19:52.993546 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.007382 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.007564 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-utilities\") pod \"community-operators-4fmgq\" (UID: \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.007660 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.507614864 +0000 UTC m=+252.736236066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.007731 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl5fw\" (UniqueName: \"kubernetes.io/projected/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-kube-api-access-dl5fw\") pod \"community-operators-4fmgq\" (UID: \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.007797 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-catalog-content\") pod \"community-operators-4fmgq\" (UID: \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.008264 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.008799 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 15:19:53.508766493 +0000 UTC m=+252.737387695 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.027084 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.027153 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.086495 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r6j98"] Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.096636 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r6j98" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.100855 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6j98"] Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.108819 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.109062 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.609028076 +0000 UTC m=+252.837649278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.109160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.109203 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-utilities\") pod \"community-operators-4fmgq\" (UID: \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.109227 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl5fw\" (UniqueName: \"kubernetes.io/projected/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-kube-api-access-dl5fw\") pod \"community-operators-4fmgq\" (UID: \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.109246 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-catalog-content\") pod \"community-operators-4fmgq\" (UID: 
\"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.109616 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-catalog-content\") pod \"community-operators-4fmgq\" (UID: \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.110215 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.610202166 +0000 UTC m=+252.838823368 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.112040 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-utilities\") pod \"community-operators-4fmgq\" (UID: \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.134101 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl5fw\" (UniqueName: \"kubernetes.io/projected/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-kube-api-access-dl5fw\") pod \"community-operators-4fmgq\" (UID: 
\"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.210940 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.211350 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.211459 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.711428852 +0000 UTC m=+252.940050054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.211552 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.211891 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-utilities\") pod \"certified-operators-r6j98\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") " pod="openshift-marketplace/certified-operators-r6j98" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.212026 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-catalog-content\") pod \"certified-operators-r6j98\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") " pod="openshift-marketplace/certified-operators-r6j98" Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.212045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68ws\" (UniqueName: 
\"kubernetes.io/projected/6d94fc0b-9a51-41a1-b346-767fa239b631-kube-api-access-b68ws\") pod \"certified-operators-r6j98\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") " pod="openshift-marketplace/certified-operators-r6j98" Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.212216 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.712202221 +0000 UTC m=+252.940823423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.283543 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-566qg"] Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.285296 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.287103 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" event={"ID":"a0231568-cb75-4c4a-be45-e07f0a03c320","Type":"ContainerStarted","Data":"6ee79a8191b90649f3db34e11009c76005fb34442645861bd731cd42f5da013c"}
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.294582 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-566qg"]
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.317726 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.322624 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.82259958 +0000 UTC m=+253.051220782 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.318124 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-utilities\") pod \"certified-operators-r6j98\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") " pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.323051 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-catalog-content\") pod \"certified-operators-r6j98\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") " pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.323077 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b68ws\" (UniqueName: \"kubernetes.io/projected/6d94fc0b-9a51-41a1-b346-767fa239b631-kube-api-access-b68ws\") pod \"certified-operators-r6j98\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") " pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.323231 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.323678 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.823669118 +0000 UTC m=+253.052290320 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.324207 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-utilities\") pod \"certified-operators-r6j98\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") " pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.325380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-catalog-content\") pod \"certified-operators-r6j98\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") " pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.330883 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlzb8"]
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.331733 4771 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.341251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68ws\" (UniqueName: \"kubernetes.io/projected/6d94fc0b-9a51-41a1-b346-767fa239b631-kube-api-access-b68ws\") pod \"certified-operators-r6j98\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") " pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.410181 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.414251 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.424249 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.424583 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-utilities\") pod \"community-operators-566qg\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.424910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h947r\" (UniqueName: \"kubernetes.io/projected/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-kube-api-access-h947r\") pod \"community-operators-566qg\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.424972 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-catalog-content\") pod \"community-operators-566qg\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.425737 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:53.925714915 +0000 UTC m=+253.154336117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.527552 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.527631 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-utilities\") pod \"community-operators-566qg\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.527704 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h947r\" (UniqueName: \"kubernetes.io/projected/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-kube-api-access-h947r\") pod \"community-operators-566qg\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.527753 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-catalog-content\") pod \"community-operators-566qg\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.528784 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:54.028763038 +0000 UTC m=+253.257384240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.529328 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-utilities\") pod \"community-operators-566qg\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.541427 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-catalog-content\") pod \"community-operators-566qg\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.564154 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h947r\" (UniqueName: \"kubernetes.io/projected/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-kube-api-access-h947r\") pod \"community-operators-566qg\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.565865 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4fmgq"]
Mar 19 15:19:53 crc kubenswrapper[4771]: W0319 15:19:53.575155 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae9495d8_bbe9_4f54_8c12_56b9f40530e1.slice/crio-c279eaaf44c6bf82eb37a43943f19c5c5b4c332b02c97ba959f4b9389fc5c5d4 WatchSource:0}: Error finding container c279eaaf44c6bf82eb37a43943f19c5c5b4c332b02c97ba959f4b9389fc5c5d4: Status 404 returned error can't find the container with id c279eaaf44c6bf82eb37a43943f19c5c5b4c332b02c97ba959f4b9389fc5c5d4
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.615421 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-566qg"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.639417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.640650 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 15:19:54.140622503 +0000 UTC m=+253.369243705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.701422 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"]
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.702342 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.707414 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.707829 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.708081 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.708232 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.708511 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.708838 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.728282 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"]
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.742850 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:53 crc kubenswrapper[4771]: E0319 15:19:53.743243 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 15:19:54.243228695 +0000 UTC m=+253.471849897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hcvsv" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.753562 4771 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-19T15:19:53.331818294Z","Handler":null,"Name":""}
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.758906 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 15:19:53 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld
Mar 19 15:19:53 crc kubenswrapper[4771]: [+]process-running ok
Mar 19 15:19:53 crc kubenswrapper[4771]: healthz check failed
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.758962 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.781287 4771 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.781339 4771 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.839929 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6j98"]
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.847337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.847724 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28a3071-870c-46a0-ab9a-cf643ab2cb64-serving-cert\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.848426 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-client-ca\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.848459 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d24zg\" (UniqueName: \"kubernetes.io/projected/b28a3071-870c-46a0-ab9a-cf643ab2cb64-kube-api-access-d24zg\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.848498 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-config\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.869159 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.955728 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.956166 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28a3071-870c-46a0-ab9a-cf643ab2cb64-serving-cert\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.956223 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-client-ca\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.956255 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d24zg\" (UniqueName: \"kubernetes.io/projected/b28a3071-870c-46a0-ab9a-cf643ab2cb64-kube-api-access-d24zg\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.956327 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-config\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.957554 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-config\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.958779 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-client-ca\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.965970 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28a3071-870c-46a0-ab9a-cf643ab2cb64-serving-cert\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.966563 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.966611 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.975442 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d24zg\" (UniqueName: \"kubernetes.io/projected/b28a3071-870c-46a0-ab9a-cf643ab2cb64-kube-api-access-d24zg\") pod \"route-controller-manager-6845cb6fb-6nflk\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:53 crc kubenswrapper[4771]: I0319 15:19:53.997143 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hcvsv\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.051470 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.126216 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.181476 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-566qg"]
Mar 19 15:19:54 crc kubenswrapper[4771]: W0319 15:19:54.187756 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b59b88d_6b8a_43dd_ae40_3091d533b8ae.slice/crio-9fdfa9fd900f326db938c19c2f7cd613357f9b0ba350d21eb70dd2a28956c1c3 WatchSource:0}: Error finding container 9fdfa9fd900f326db938c19c2f7cd613357f9b0ba350d21eb70dd2a28956c1c3: Status 404 returned error can't find the container with id 9fdfa9fd900f326db938c19c2f7cd613357f9b0ba350d21eb70dd2a28956c1c3
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.263719 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"]
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.303314 4771 generic.go:334] "Generic (PLEG): container finished" podID="b49408ed-5087-4cb2-b70e-391c32aad069" containerID="34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401" exitCode=0
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.303390 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzb8" event={"ID":"b49408ed-5087-4cb2-b70e-391c32aad069","Type":"ContainerDied","Data":"34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401"}
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.303426 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzb8" event={"ID":"b49408ed-5087-4cb2-b70e-391c32aad069","Type":"ContainerStarted","Data":"c01c5999bed964b7f7e8295008a402360e7143c4f7b39d330d5e3982c5d2b86a"}
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.311722 4771 generic.go:334] "Generic (PLEG): container finished" podID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerID="d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4" exitCode=0
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.311791 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fmgq" event={"ID":"ae9495d8-bbe9-4f54-8c12-56b9f40530e1","Type":"ContainerDied","Data":"d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4"}
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.311823 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fmgq" event={"ID":"ae9495d8-bbe9-4f54-8c12-56b9f40530e1","Type":"ContainerStarted","Data":"c279eaaf44c6bf82eb37a43943f19c5c5b4c332b02c97ba959f4b9389fc5c5d4"}
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.333656 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" event={"ID":"a0231568-cb75-4c4a-be45-e07f0a03c320","Type":"ContainerStarted","Data":"7cae019dbe35c0911f78265635ce9b72e299e944697e7e78a7aebb0a5bcdadab"}
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.333738 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" event={"ID":"a0231568-cb75-4c4a-be45-e07f0a03c320","Type":"ContainerStarted","Data":"4b1873611d6fc13ebbde4854ffcd46c5e2c43379607cb8c49c0c8bc2cbe517d1"}
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.336528 4771 generic.go:334] "Generic (PLEG): container finished" podID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerID="a6d9502b56a75d5027dea0c6187c409ea0e2713fb292f9eed7c550ba61df0c24" exitCode=0
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.336608 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6j98" event={"ID":"6d94fc0b-9a51-41a1-b346-767fa239b631","Type":"ContainerDied","Data":"a6d9502b56a75d5027dea0c6187c409ea0e2713fb292f9eed7c550ba61df0c24"}
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.341108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6j98" event={"ID":"6d94fc0b-9a51-41a1-b346-767fa239b631","Type":"ContainerStarted","Data":"69d64d35df41d0ecb191c844883fbd0ea9d72f539519d976a24db24cb53d4462"}
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.341155 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-566qg" event={"ID":"9b59b88d-6b8a-43dd-ae40-3091d533b8ae","Type":"ContainerStarted","Data":"9fdfa9fd900f326db938c19c2f7cd613357f9b0ba350d21eb70dd2a28956c1c3"}
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.346526 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wm59s"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.392371 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bwgwn" podStartSLOduration=11.392352883 podStartE2EDuration="11.392352883s" podCreationTimestamp="2026-03-19 15:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:54.39066558 +0000 UTC m=+253.619286782" watchObservedRunningTime="2026-03-19 15:19:54.392352883 +0000 UTC m=+253.620974085"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.425774 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hcvsv"]
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.680797 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dmtnc"]
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.682356 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.687932 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.690817 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmtnc"]
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.740966 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 15:19:54 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld
Mar 19 15:19:54 crc kubenswrapper[4771]: [+]process-running ok
Mar 19 15:19:54 crc kubenswrapper[4771]: healthz check failed
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.741068 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.774000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-utilities\") pod \"redhat-marketplace-dmtnc\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.774039 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-catalog-content\") pod \"redhat-marketplace-dmtnc\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.774117 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qw49\" (UniqueName: \"kubernetes.io/projected/65b4c9ce-e8af-4eca-abf5-08149432aaa5-kube-api-access-5qw49\") pod \"redhat-marketplace-dmtnc\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.875535 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qw49\" (UniqueName: \"kubernetes.io/projected/65b4c9ce-e8af-4eca-abf5-08149432aaa5-kube-api-access-5qw49\") pod \"redhat-marketplace-dmtnc\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.875641 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-utilities\") pod \"redhat-marketplace-dmtnc\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.875658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-catalog-content\") pod \"redhat-marketplace-dmtnc\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.876533 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-catalog-content\") pod \"redhat-marketplace-dmtnc\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.877116 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-utilities\") pod \"redhat-marketplace-dmtnc\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.898728 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qw49\" (UniqueName: \"kubernetes.io/projected/65b4c9ce-e8af-4eca-abf5-08149432aaa5-kube-api-access-5qw49\") pod \"redhat-marketplace-dmtnc\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.927371 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.927427 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-h97xq"
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.928865 4771 patch_prober.go:28] interesting pod/console-f9d7485db-h97xq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 19 15:19:54 crc kubenswrapper[4771]: I0319 15:19:54.928918 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-h97xq" podUID="6dc754c0-8f17-402b-9bd4-be033eb940ba" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.008348 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmtnc"
Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.082625 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g7zl5"]
Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.084007 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.096291 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7zl5"] Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.124386 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.133215 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rn8sc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.180770 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxvd6\" (UniqueName: \"kubernetes.io/projected/cc023f86-d23c-4ca1-810a-de7ece9bb340-kube-api-access-kxvd6\") pod \"redhat-marketplace-g7zl5\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.180839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-utilities\") pod \"redhat-marketplace-g7zl5\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.180864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-catalog-content\") pod \"redhat-marketplace-g7zl5\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.282032 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-utilities\") pod \"redhat-marketplace-g7zl5\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.282085 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-catalog-content\") pod \"redhat-marketplace-g7zl5\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.282260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxvd6\" (UniqueName: \"kubernetes.io/projected/cc023f86-d23c-4ca1-810a-de7ece9bb340-kube-api-access-kxvd6\") pod \"redhat-marketplace-g7zl5\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.282619 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-utilities\") pod \"redhat-marketplace-g7zl5\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.282854 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-catalog-content\") pod \"redhat-marketplace-g7zl5\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.319446 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxvd6\" (UniqueName: 
\"kubernetes.io/projected/cc023f86-d23c-4ca1-810a-de7ece9bb340-kube-api-access-kxvd6\") pod \"redhat-marketplace-g7zl5\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.332925 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-m25b8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.333002 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-m25b8" podUID="6f5dca39-298b-4814-b77c-43dd0cbc4025" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.333022 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-m25b8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.333086 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m25b8" podUID="6f5dca39-298b-4814-b77c-43dd0cbc4025" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.351102 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" event={"ID":"e2f99f52-00ff-42f0-a2ee-122235c86b2b","Type":"ContainerStarted","Data":"2841347aafecaf651da70723292ec8c3f5c9e21c870e998848b82b50e9aa2c47"} Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 
15:19:55.351150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" event={"ID":"e2f99f52-00ff-42f0-a2ee-122235c86b2b","Type":"ContainerStarted","Data":"3aad2894516c06b69764cb2abf60bea5f53f4d659ea3ef0ff9e4c39a2e3e05ec"} Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.352127 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.369379 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" event={"ID":"b28a3071-870c-46a0-ab9a-cf643ab2cb64","Type":"ContainerStarted","Data":"3648d2df29aba1b48da9f08be625e2aea187022a5840e768f6152976bf3630aa"} Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.369435 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" event={"ID":"b28a3071-870c-46a0-ab9a-cf643ab2cb64","Type":"ContainerStarted","Data":"396786c735d77e9cafb68efb343c576f2873eb506ae67ab9692cb6bf83fbc874"} Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.370351 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.371428 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.375815 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" podStartSLOduration=195.375800638 podStartE2EDuration="3m15.375800638s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 15:19:55.370741079 +0000 UTC m=+254.599362281" watchObservedRunningTime="2026-03-19 15:19:55.375800638 +0000 UTC m=+254.604421840" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.378129 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.378226 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.383099 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.383210 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.383447 4771 generic.go:334] "Generic (PLEG): container finished" podID="7c846868-7bc7-4aca-b36c-b8e85cc31ac2" containerID="6d27cff11ca6aa5cb01859ebf55278cc2a592117f476c9260f648dd1cda3c0ef" exitCode=0 Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.383537 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" event={"ID":"7c846868-7bc7-4aca-b36c-b8e85cc31ac2","Type":"ContainerDied","Data":"6d27cff11ca6aa5cb01859ebf55278cc2a592117f476c9260f648dd1cda3c0ef"} Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.383640 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.403601 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.408540 4771 generic.go:334] "Generic (PLEG): container finished" podID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerID="cb6b3e16022f09819a9b95db9c47ccbbd7c7536db00bf90365302dc63e377758" exitCode=0 Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.409444 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-566qg" event={"ID":"9b59b88d-6b8a-43dd-ae40-3091d533b8ae","Type":"ContainerDied","Data":"cb6b3e16022f09819a9b95db9c47ccbbd7c7536db00bf90365302dc63e377758"} Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.407621 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" podStartSLOduration=6.407605403 podStartE2EDuration="6.407605403s" podCreationTimestamp="2026-03-19 15:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:19:55.404868914 +0000 UTC m=+254.633490116" watchObservedRunningTime="2026-03-19 15:19:55.407605403 +0000 UTC m=+254.636226605" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.484841 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0532d316-51bb-428d-886b-7609738ab65f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0532d316-51bb-428d-886b-7609738ab65f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.484931 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0532d316-51bb-428d-886b-7609738ab65f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"0532d316-51bb-428d-886b-7609738ab65f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.522590 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.523100 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmtnc"] Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.588493 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0532d316-51bb-428d-886b-7609738ab65f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0532d316-51bb-428d-886b-7609738ab65f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.589644 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0532d316-51bb-428d-886b-7609738ab65f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0532d316-51bb-428d-886b-7609738ab65f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.590207 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0532d316-51bb-428d-886b-7609738ab65f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0532d316-51bb-428d-886b-7609738ab65f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.613081 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0532d316-51bb-428d-886b-7609738ab65f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0532d316-51bb-428d-886b-7609738ab65f\") 
" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.673655 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7zl5"] Mar 19 15:19:55 crc kubenswrapper[4771]: W0319 15:19:55.691306 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc023f86_d23c_4ca1_810a_de7ece9bb340.slice/crio-c2c6e67d37b7443339f3db8acc75ab8b026540a050c3c8db3b7abbe492c04503 WatchSource:0}: Error finding container c2c6e67d37b7443339f3db8acc75ab8b026540a050c3c8db3b7abbe492c04503: Status 404 returned error can't find the container with id c2c6e67d37b7443339f3db8acc75ab8b026540a050c3c8db3b7abbe492c04503 Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.716368 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.740184 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.744428 4771 patch_prober.go:28] interesting pod/router-default-5444994796-5jvq8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 15:19:55 crc kubenswrapper[4771]: [+]has-synced ok Mar 19 15:19:55 crc kubenswrapper[4771]: [+]process-running ok Mar 19 15:19:55 crc kubenswrapper[4771]: healthz check failed Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.744694 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5jvq8" podUID="0c7b96e7-c74b-4608-a81f-92a7d977c7d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 15:19:55 crc kubenswrapper[4771]: 
I0319 15:19:55.888054 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tdfqz"] Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.889530 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.905084 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.906203 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tdfqz"] Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.996548 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-catalog-content\") pod \"redhat-operators-tdfqz\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.996922 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-utilities\") pod \"redhat-operators-tdfqz\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:55 crc kubenswrapper[4771]: I0319 15:19:55.996970 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnv6j\" (UniqueName: \"kubernetes.io/projected/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-kube-api-access-pnv6j\") pod \"redhat-operators-tdfqz\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.066678 4771 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 15:19:56 crc kubenswrapper[4771]: W0319 15:19:56.095619 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0532d316_51bb_428d_886b_7609738ab65f.slice/crio-1df489841afbf976c8d1d75baedeb0cabf9159058552af88c1e587e053604d72 WatchSource:0}: Error finding container 1df489841afbf976c8d1d75baedeb0cabf9159058552af88c1e587e053604d72: Status 404 returned error can't find the container with id 1df489841afbf976c8d1d75baedeb0cabf9159058552af88c1e587e053604d72 Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.097705 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnv6j\" (UniqueName: \"kubernetes.io/projected/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-kube-api-access-pnv6j\") pod \"redhat-operators-tdfqz\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.097784 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-catalog-content\") pod \"redhat-operators-tdfqz\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.097848 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-utilities\") pod \"redhat-operators-tdfqz\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.098440 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-utilities\") pod 
\"redhat-operators-tdfqz\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.101291 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-catalog-content\") pod \"redhat-operators-tdfqz\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.126426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnv6j\" (UniqueName: \"kubernetes.io/projected/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-kube-api-access-pnv6j\") pod \"redhat-operators-tdfqz\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.209334 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.285311 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dkz7z"] Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.288974 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.291241 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dkz7z"] Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.401591 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-catalog-content\") pod \"redhat-operators-dkz7z\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") " pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.401629 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-utilities\") pod \"redhat-operators-dkz7z\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") " pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.401704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k556n\" (UniqueName: \"kubernetes.io/projected/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-kube-api-access-k556n\") pod \"redhat-operators-dkz7z\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") " pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.417814 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0532d316-51bb-428d-886b-7609738ab65f","Type":"ContainerStarted","Data":"1df489841afbf976c8d1d75baedeb0cabf9159058552af88c1e587e053604d72"} Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.420169 4771 generic.go:334] "Generic (PLEG): container finished" podID="cc023f86-d23c-4ca1-810a-de7ece9bb340" 
containerID="39a362076694bcd779c42b709b9152b5bbebee011b78abe9e20a0f299b24f2d2" exitCode=0 Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.420249 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7zl5" event={"ID":"cc023f86-d23c-4ca1-810a-de7ece9bb340","Type":"ContainerDied","Data":"39a362076694bcd779c42b709b9152b5bbebee011b78abe9e20a0f299b24f2d2"} Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.420271 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7zl5" event={"ID":"cc023f86-d23c-4ca1-810a-de7ece9bb340","Type":"ContainerStarted","Data":"c2c6e67d37b7443339f3db8acc75ab8b026540a050c3c8db3b7abbe492c04503"} Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.423356 4771 generic.go:334] "Generic (PLEG): container finished" podID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerID="2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a" exitCode=0 Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.423430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmtnc" event={"ID":"65b4c9ce-e8af-4eca-abf5-08149432aaa5","Type":"ContainerDied","Data":"2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a"} Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.423509 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmtnc" event={"ID":"65b4c9ce-e8af-4eca-abf5-08149432aaa5","Type":"ContainerStarted","Data":"515d6284b025565bf857a3b989c2bf7975aea54ea553ab963038b1dee40a2a2c"} Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.503094 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-catalog-content\") pod \"redhat-operators-dkz7z\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") " 
pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.503157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-utilities\") pod \"redhat-operators-dkz7z\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") " pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.503321 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k556n\" (UniqueName: \"kubernetes.io/projected/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-kube-api-access-k556n\") pod \"redhat-operators-dkz7z\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") " pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.504678 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-catalog-content\") pod \"redhat-operators-dkz7z\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") " pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.504935 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-utilities\") pod \"redhat-operators-dkz7z\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") " pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.525563 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k556n\" (UniqueName: \"kubernetes.io/projected/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-kube-api-access-k556n\") pod \"redhat-operators-dkz7z\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") " pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 
crc kubenswrapper[4771]: I0319 15:19:56.614407 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dkz7z" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.650026 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tdfqz"] Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.745143 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.752335 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5jvq8" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.977642 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.979438 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.987273 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.987346 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 19 15:19:56 crc kubenswrapper[4771]: I0319 15:19:56.990134 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.119263 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.119366 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.220715 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.222707 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.223571 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.243652 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.306874 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.430808 4771 generic.go:334] "Generic (PLEG): container finished" podID="0532d316-51bb-428d-886b-7609738ab65f" containerID="4e4a2f479ba1f8fbe172239ac924ba18a6e4cd66daa681ae65ccf29934789bd1" exitCode=0 Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.431415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0532d316-51bb-428d-886b-7609738ab65f","Type":"ContainerDied","Data":"4e4a2f479ba1f8fbe172239ac924ba18a6e4cd66daa681ae65ccf29934789bd1"} Mar 19 15:19:57 crc kubenswrapper[4771]: I0319 15:19:57.708417 4771 ???:1] "http: TLS handshake error from 192.168.126.11:48594: no serving certificate available for the kubelet" Mar 19 15:19:58 crc kubenswrapper[4771]: I0319 15:19:58.035617 4771 ???:1] "http: TLS handshake error from 192.168.126.11:48604: no serving certificate available for the kubelet" Mar 19 15:20:00 crc kubenswrapper[4771]: I0319 15:20:00.141924 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565560-5rgwr"] Mar 19 15:20:00 crc kubenswrapper[4771]: I0319 15:20:00.143956 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565560-5rgwr" Mar 19 15:20:00 crc kubenswrapper[4771]: I0319 15:20:00.158477 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:20:00 crc kubenswrapper[4771]: I0319 15:20:00.175396 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565560-5rgwr"] Mar 19 15:20:00 crc kubenswrapper[4771]: I0319 15:20:00.277518 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5fmh\" (UniqueName: \"kubernetes.io/projected/c681a9f8-ad65-46af-b5a2-3ea110cda37f-kube-api-access-w5fmh\") pod \"auto-csr-approver-29565560-5rgwr\" (UID: \"c681a9f8-ad65-46af-b5a2-3ea110cda37f\") " pod="openshift-infra/auto-csr-approver-29565560-5rgwr" Mar 19 15:20:00 crc kubenswrapper[4771]: I0319 15:20:00.379488 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5fmh\" (UniqueName: \"kubernetes.io/projected/c681a9f8-ad65-46af-b5a2-3ea110cda37f-kube-api-access-w5fmh\") pod \"auto-csr-approver-29565560-5rgwr\" (UID: \"c681a9f8-ad65-46af-b5a2-3ea110cda37f\") " pod="openshift-infra/auto-csr-approver-29565560-5rgwr" Mar 19 15:20:00 crc kubenswrapper[4771]: I0319 15:20:00.395815 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5fmh\" (UniqueName: \"kubernetes.io/projected/c681a9f8-ad65-46af-b5a2-3ea110cda37f-kube-api-access-w5fmh\") pod \"auto-csr-approver-29565560-5rgwr\" (UID: \"c681a9f8-ad65-46af-b5a2-3ea110cda37f\") " pod="openshift-infra/auto-csr-approver-29565560-5rgwr" Mar 19 15:20:00 crc kubenswrapper[4771]: I0319 15:20:00.475358 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565560-5rgwr" Mar 19 15:20:01 crc kubenswrapper[4771]: I0319 15:20:01.263704 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hwlpq" Mar 19 15:20:04 crc kubenswrapper[4771]: I0319 15:20:04.935386 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:20:04 crc kubenswrapper[4771]: I0319 15:20:04.944751 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:20:05 crc kubenswrapper[4771]: I0319 15:20:05.337956 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-m25b8" Mar 19 15:20:05 crc kubenswrapper[4771]: W0319 15:20:05.760407 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f49e754_62f6_4f17_a6ed_fe5e3abe32b4.slice/crio-00d07be24cd01501f2c49ca3c4cddc06b9772cc0fd4fa70fb6541055b211c326 WatchSource:0}: Error finding container 00d07be24cd01501f2c49ca3c4cddc06b9772cc0fd4fa70fb6541055b211c326: Status 404 returned error can't find the container with id 00d07be24cd01501f2c49ca3c4cddc06b9772cc0fd4fa70fb6541055b211c326 Mar 19 15:20:05 crc kubenswrapper[4771]: I0319 15:20:05.838089 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:20:05 crc kubenswrapper[4771]: I0319 15:20:05.961613 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:20:05 crc kubenswrapper[4771]: I0319 15:20:05.978041 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-secret-volume\") pod \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " Mar 19 15:20:05 crc kubenswrapper[4771]: I0319 15:20:05.978091 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dgpr\" (UniqueName: \"kubernetes.io/projected/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-kube-api-access-9dgpr\") pod \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " Mar 19 15:20:05 crc kubenswrapper[4771]: I0319 15:20:05.978145 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-config-volume\") pod \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\" (UID: \"7c846868-7bc7-4aca-b36c-b8e85cc31ac2\") " Mar 19 15:20:05 crc kubenswrapper[4771]: I0319 15:20:05.979192 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-config-volume" (OuterVolumeSpecName: "config-volume") pod "7c846868-7bc7-4aca-b36c-b8e85cc31ac2" (UID: "7c846868-7bc7-4aca-b36c-b8e85cc31ac2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:05 crc kubenswrapper[4771]: I0319 15:20:05.987207 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7c846868-7bc7-4aca-b36c-b8e85cc31ac2" (UID: "7c846868-7bc7-4aca-b36c-b8e85cc31ac2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:20:05 crc kubenswrapper[4771]: I0319 15:20:05.988855 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-kube-api-access-9dgpr" (OuterVolumeSpecName: "kube-api-access-9dgpr") pod "7c846868-7bc7-4aca-b36c-b8e85cc31ac2" (UID: "7c846868-7bc7-4aca-b36c-b8e85cc31ac2"). InnerVolumeSpecName "kube-api-access-9dgpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.079474 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0532d316-51bb-428d-886b-7609738ab65f-kubelet-dir\") pod \"0532d316-51bb-428d-886b-7609738ab65f\" (UID: \"0532d316-51bb-428d-886b-7609738ab65f\") " Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.079946 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0532d316-51bb-428d-886b-7609738ab65f-kube-api-access\") pod \"0532d316-51bb-428d-886b-7609738ab65f\" (UID: \"0532d316-51bb-428d-886b-7609738ab65f\") " Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.079875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0532d316-51bb-428d-886b-7609738ab65f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0532d316-51bb-428d-886b-7609738ab65f" (UID: "0532d316-51bb-428d-886b-7609738ab65f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.081406 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0532d316-51bb-428d-886b-7609738ab65f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.081584 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.081720 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.081848 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dgpr\" (UniqueName: \"kubernetes.io/projected/7c846868-7bc7-4aca-b36c-b8e85cc31ac2-kube-api-access-9dgpr\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.084217 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0532d316-51bb-428d-886b-7609738ab65f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0532d316-51bb-428d-886b-7609738ab65f" (UID: "0532d316-51bb-428d-886b-7609738ab65f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.183734 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0532d316-51bb-428d-886b-7609738ab65f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.519307 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tdfqz" event={"ID":"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4","Type":"ContainerStarted","Data":"00d07be24cd01501f2c49ca3c4cddc06b9772cc0fd4fa70fb6541055b211c326"} Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.521566 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" event={"ID":"7c846868-7bc7-4aca-b36c-b8e85cc31ac2","Type":"ContainerDied","Data":"a2bf69f4bcf4420f8f65fcac0be1c27067e648779544e8f8a7455249ca08b003"} Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.521614 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.521627 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2bf69f4bcf4420f8f65fcac0be1c27067e648779544e8f8a7455249ca08b003" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.523749 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0532d316-51bb-428d-886b-7609738ab65f","Type":"ContainerDied","Data":"1df489841afbf976c8d1d75baedeb0cabf9159058552af88c1e587e053604d72"} Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.523781 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1df489841afbf976c8d1d75baedeb0cabf9159058552af88c1e587e053604d72" Mar 19 15:20:06 crc kubenswrapper[4771]: I0319 15:20:06.523889 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 15:20:07 crc kubenswrapper[4771]: I0319 15:20:07.440008 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8"] Mar 19 15:20:07 crc kubenswrapper[4771]: I0319 15:20:07.440548 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" podUID="e51098c3-dbac-473d-ab5e-a3adf6018c28" containerName="controller-manager" containerID="cri-o://8359e5ead9c7aecb1d484b35f501e32a86a8eef5b356325ae85c0458ce55d4a2" gracePeriod=30 Mar 19 15:20:07 crc kubenswrapper[4771]: I0319 15:20:07.458093 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"] Mar 19 15:20:07 crc kubenswrapper[4771]: I0319 15:20:07.458332 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" podUID="b28a3071-870c-46a0-ab9a-cf643ab2cb64" containerName="route-controller-manager" containerID="cri-o://3648d2df29aba1b48da9f08be625e2aea187022a5840e768f6152976bf3630aa" gracePeriod=30 Mar 19 15:20:07 crc kubenswrapper[4771]: I0319 15:20:07.982119 4771 ???:1] "http: TLS handshake error from 192.168.126.11:45446: no serving certificate available for the kubelet" Mar 19 15:20:10 crc kubenswrapper[4771]: I0319 15:20:10.551435 4771 generic.go:334] "Generic (PLEG): container finished" podID="e51098c3-dbac-473d-ab5e-a3adf6018c28" containerID="8359e5ead9c7aecb1d484b35f501e32a86a8eef5b356325ae85c0458ce55d4a2" exitCode=0 Mar 19 15:20:10 crc kubenswrapper[4771]: I0319 15:20:10.551766 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" event={"ID":"e51098c3-dbac-473d-ab5e-a3adf6018c28","Type":"ContainerDied","Data":"8359e5ead9c7aecb1d484b35f501e32a86a8eef5b356325ae85c0458ce55d4a2"} Mar 19 15:20:10 crc kubenswrapper[4771]: I0319 15:20:10.553928 4771 generic.go:334] "Generic (PLEG): container finished" podID="b28a3071-870c-46a0-ab9a-cf643ab2cb64" containerID="3648d2df29aba1b48da9f08be625e2aea187022a5840e768f6152976bf3630aa" exitCode=0 Mar 19 15:20:10 crc kubenswrapper[4771]: I0319 15:20:10.553955 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" event={"ID":"b28a3071-870c-46a0-ab9a-cf643ab2cb64","Type":"ContainerDied","Data":"3648d2df29aba1b48da9f08be625e2aea187022a5840e768f6152976bf3630aa"} Mar 19 15:20:11 crc kubenswrapper[4771]: I0319 15:20:11.163418 4771 patch_prober.go:28] interesting pod/controller-manager-5f456bfcb7-sx5j8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" 
start-of-body= Mar 19 15:20:11 crc kubenswrapper[4771]: I0319 15:20:11.163520 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" podUID="e51098c3-dbac-473d-ab5e-a3adf6018c28" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 19 15:20:13 crc kubenswrapper[4771]: I0319 15:20:13.390689 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 15:20:14 crc kubenswrapper[4771]: I0319 15:20:14.955252 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.051966 4771 patch_prober.go:28] interesting pod/route-controller-manager-6845cb6fb-6nflk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": context deadline exceeded" start-of-body= Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.052040 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" podUID="b28a3071-870c-46a0-ab9a-cf643ab2cb64" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": context deadline exceeded" Mar 19 15:20:15 crc kubenswrapper[4771]: E0319 15:20:15.110152 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 19 15:20:15 crc kubenswrapper[4771]: E0319 15:20:15.110304 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:20:15 crc kubenswrapper[4771]: container 
&Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 15:20:15 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjb4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29565558-wvlb8_openshift-infra(af3ce0f9-bc02-4142-8655-9751fe9197db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 19 15:20:15 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:20:15 crc kubenswrapper[4771]: E0319 15:20:15.111480 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29565558-wvlb8" podUID="af3ce0f9-bc02-4142-8655-9751fe9197db" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.409806 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.415369 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.462449 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"] Mar 19 15:20:15 crc kubenswrapper[4771]: E0319 15:20:15.463046 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28a3071-870c-46a0-ab9a-cf643ab2cb64" containerName="route-controller-manager" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.463066 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28a3071-870c-46a0-ab9a-cf643ab2cb64" containerName="route-controller-manager" Mar 19 15:20:15 crc kubenswrapper[4771]: E0319 15:20:15.463090 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c846868-7bc7-4aca-b36c-b8e85cc31ac2" containerName="collect-profiles" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.463097 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c846868-7bc7-4aca-b36c-b8e85cc31ac2" containerName="collect-profiles" Mar 19 15:20:15 crc kubenswrapper[4771]: E0319 15:20:15.463119 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0532d316-51bb-428d-886b-7609738ab65f" containerName="pruner" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.463126 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0532d316-51bb-428d-886b-7609738ab65f" containerName="pruner" Mar 19 15:20:15 crc kubenswrapper[4771]: E0319 15:20:15.463139 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51098c3-dbac-473d-ab5e-a3adf6018c28" containerName="controller-manager" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.463145 4771 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e51098c3-dbac-473d-ab5e-a3adf6018c28" containerName="controller-manager" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.463886 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c846868-7bc7-4aca-b36c-b8e85cc31ac2" containerName="collect-profiles" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.463942 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0532d316-51bb-428d-886b-7609738ab65f" containerName="pruner" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.463968 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28a3071-870c-46a0-ab9a-cf643ab2cb64" containerName="route-controller-manager" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.464092 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51098c3-dbac-473d-ab5e-a3adf6018c28" containerName="controller-manager" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.465581 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471025 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-proxy-ca-bundles\") pod \"e51098c3-dbac-473d-ab5e-a3adf6018c28\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471103 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-config\") pod \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d24zg\" (UniqueName: \"kubernetes.io/projected/b28a3071-870c-46a0-ab9a-cf643ab2cb64-kube-api-access-d24zg\") pod \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471221 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xskff\" (UniqueName: \"kubernetes.io/projected/e51098c3-dbac-473d-ab5e-a3adf6018c28-kube-api-access-xskff\") pod \"e51098c3-dbac-473d-ab5e-a3adf6018c28\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471260 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-client-ca\") pod \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471293 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-client-ca\") pod \"e51098c3-dbac-473d-ab5e-a3adf6018c28\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471350 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-config\") pod \"e51098c3-dbac-473d-ab5e-a3adf6018c28\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471383 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28a3071-870c-46a0-ab9a-cf643ab2cb64-serving-cert\") pod \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\" (UID: \"b28a3071-870c-46a0-ab9a-cf643ab2cb64\") " Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471411 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51098c3-dbac-473d-ab5e-a3adf6018c28-serving-cert\") pod \"e51098c3-dbac-473d-ab5e-a3adf6018c28\" (UID: \"e51098c3-dbac-473d-ab5e-a3adf6018c28\") " Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.471673 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"] Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.473226 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-client-ca" (OuterVolumeSpecName: "client-ca") pod "e51098c3-dbac-473d-ab5e-a3adf6018c28" (UID: "e51098c3-dbac-473d-ab5e-a3adf6018c28"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.474243 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-client-ca" (OuterVolumeSpecName: "client-ca") pod "b28a3071-870c-46a0-ab9a-cf643ab2cb64" (UID: "b28a3071-870c-46a0-ab9a-cf643ab2cb64"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.474807 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-config" (OuterVolumeSpecName: "config") pod "b28a3071-870c-46a0-ab9a-cf643ab2cb64" (UID: "b28a3071-870c-46a0-ab9a-cf643ab2cb64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.475536 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e51098c3-dbac-473d-ab5e-a3adf6018c28" (UID: "e51098c3-dbac-473d-ab5e-a3adf6018c28"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.476143 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-config" (OuterVolumeSpecName: "config") pod "e51098c3-dbac-473d-ab5e-a3adf6018c28" (UID: "e51098c3-dbac-473d-ab5e-a3adf6018c28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.481606 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28a3071-870c-46a0-ab9a-cf643ab2cb64-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b28a3071-870c-46a0-ab9a-cf643ab2cb64" (UID: "b28a3071-870c-46a0-ab9a-cf643ab2cb64"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.492093 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51098c3-dbac-473d-ab5e-a3adf6018c28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e51098c3-dbac-473d-ab5e-a3adf6018c28" (UID: "e51098c3-dbac-473d-ab5e-a3adf6018c28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.492355 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28a3071-870c-46a0-ab9a-cf643ab2cb64-kube-api-access-d24zg" (OuterVolumeSpecName: "kube-api-access-d24zg") pod "b28a3071-870c-46a0-ab9a-cf643ab2cb64" (UID: "b28a3071-870c-46a0-ab9a-cf643ab2cb64"). InnerVolumeSpecName "kube-api-access-d24zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.494141 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51098c3-dbac-473d-ab5e-a3adf6018c28-kube-api-access-xskff" (OuterVolumeSpecName: "kube-api-access-xskff") pod "e51098c3-dbac-473d-ab5e-a3adf6018c28" (UID: "e51098c3-dbac-473d-ab5e-a3adf6018c28"). InnerVolumeSpecName "kube-api-access-xskff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.572648 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nll8v\" (UniqueName: \"kubernetes.io/projected/9355fee9-a66a-4e7d-b903-92cefb7c193a-kube-api-access-nll8v\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.572703 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-config\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.572971 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-client-ca\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573094 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-proxy-ca-bundles\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573154 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9355fee9-a66a-4e7d-b903-92cefb7c193a-serving-cert\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573322 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573345 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28a3071-870c-46a0-ab9a-cf643ab2cb64-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573359 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51098c3-dbac-473d-ab5e-a3adf6018c28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573370 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573384 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573396 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d24zg\" (UniqueName: \"kubernetes.io/projected/b28a3071-870c-46a0-ab9a-cf643ab2cb64-kube-api-access-d24zg\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573407 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xskff\" (UniqueName: 
\"kubernetes.io/projected/e51098c3-dbac-473d-ab5e-a3adf6018c28-kube-api-access-xskff\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573417 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28a3071-870c-46a0-ab9a-cf643ab2cb64-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.573427 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51098c3-dbac-473d-ab5e-a3adf6018c28-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.674440 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-proxy-ca-bundles\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.674491 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9355fee9-a66a-4e7d-b903-92cefb7c193a-serving-cert\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.674578 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nll8v\" (UniqueName: \"kubernetes.io/projected/9355fee9-a66a-4e7d-b903-92cefb7c193a-kube-api-access-nll8v\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.674629 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-config\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.674695 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-client-ca\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.675793 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-client-ca\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.678146 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-config\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.678378 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-proxy-ca-bundles\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc 
kubenswrapper[4771]: I0319 15:20:15.679380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9355fee9-a66a-4e7d-b903-92cefb7c193a-serving-cert\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.697832 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nll8v\" (UniqueName: \"kubernetes.io/projected/9355fee9-a66a-4e7d-b903-92cefb7c193a-kube-api-access-nll8v\") pod \"controller-manager-54b69c9fbf-ll92d\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") " pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.826823 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.985790 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"74c60410-ed09-4463-9d5d-f9eac4ebc6a5","Type":"ContainerStarted","Data":"fe5c15255cb5150789343a4d576ff762ffe473c95ed019a942962c0f5f13f2cd"} Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.988098 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" event={"ID":"e51098c3-dbac-473d-ab5e-a3adf6018c28","Type":"ContainerDied","Data":"82b18d4d68cd50b57472ab1b54e9d5857c39e22aabd0814f7a36aa8c7829ae69"} Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.988264 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.988348 4771 scope.go:117] "RemoveContainer" containerID="8359e5ead9c7aecb1d484b35f501e32a86a8eef5b356325ae85c0458ce55d4a2" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.994464 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" Mar 19 15:20:15 crc kubenswrapper[4771]: I0319 15:20:15.994974 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk" event={"ID":"b28a3071-870c-46a0-ab9a-cf643ab2cb64","Type":"ContainerDied","Data":"396786c735d77e9cafb68efb343c576f2873eb506ae67ab9692cb6bf83fbc874"} Mar 19 15:20:15 crc kubenswrapper[4771]: E0319 15:20:15.995913 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565558-wvlb8" podUID="af3ce0f9-bc02-4142-8655-9751fe9197db" Mar 19 15:20:16 crc kubenswrapper[4771]: I0319 15:20:16.029382 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8"] Mar 19 15:20:16 crc kubenswrapper[4771]: I0319 15:20:16.034360 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f456bfcb7-sx5j8"] Mar 19 15:20:16 crc kubenswrapper[4771]: I0319 15:20:16.038141 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"] Mar 19 15:20:16 crc kubenswrapper[4771]: I0319 15:20:16.040562 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6845cb6fb-6nflk"] Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.516556 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28a3071-870c-46a0-ab9a-cf643ab2cb64" path="/var/lib/kubelet/pods/b28a3071-870c-46a0-ab9a-cf643ab2cb64/volumes" Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.517469 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51098c3-dbac-473d-ab5e-a3adf6018c28" path="/var/lib/kubelet/pods/e51098c3-dbac-473d-ab5e-a3adf6018c28/volumes" Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.968712 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4"] Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.970062 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.975737 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.977269 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.977946 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.978129 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.978504 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 15:20:17 crc kubenswrapper[4771]: 
I0319 15:20:17.978724 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 15:20:17 crc kubenswrapper[4771]: I0319 15:20:17.980767 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4"] Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.005917 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-client-ca\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.006016 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8718c475-3f85-4923-9d02-e569281b9d5a-serving-cert\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.006073 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-config\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.006136 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl56f\" (UniqueName: \"kubernetes.io/projected/8718c475-3f85-4923-9d02-e569281b9d5a-kube-api-access-vl56f\") pod 
\"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.106937 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-client-ca\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.107041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8718c475-3f85-4923-9d02-e569281b9d5a-serving-cert\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.107082 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-config\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.107128 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl56f\" (UniqueName: \"kubernetes.io/projected/8718c475-3f85-4923-9d02-e569281b9d5a-kube-api-access-vl56f\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.108522 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-config\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.108821 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-client-ca\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.111575 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8718c475-3f85-4923-9d02-e569281b9d5a-serving-cert\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.127624 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl56f\" (UniqueName: \"kubernetes.io/projected/8718c475-3f85-4923-9d02-e569281b9d5a-kube-api-access-vl56f\") pod \"route-controller-manager-b4b4bcb8f-5vrf4\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") " pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:18 crc kubenswrapper[4771]: I0319 15:20:18.292514 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:21 crc kubenswrapper[4771]: E0319 15:20:21.236318 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 15:20:21 crc kubenswrapper[4771]: E0319 15:20:21.237068 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dl5fw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Start
upProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4fmgq_openshift-marketplace(ae9495d8-bbe9-4f54-8c12-56b9f40530e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 15:20:21 crc kubenswrapper[4771]: E0319 15:20:21.239076 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4fmgq" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" Mar 19 15:20:21 crc kubenswrapper[4771]: I0319 15:20:21.373921 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dkz7z"] Mar 19 15:20:21 crc kubenswrapper[4771]: I0319 15:20:21.624215 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565560-5rgwr"] Mar 19 15:20:22 crc kubenswrapper[4771]: I0319 15:20:22.032867 4771 generic.go:334] "Generic (PLEG): container finished" podID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerID="c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39" exitCode=0 Mar 19 15:20:22 crc kubenswrapper[4771]: I0319 15:20:22.032926 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tdfqz" event={"ID":"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4","Type":"ContainerDied","Data":"c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39"} Mar 19 15:20:22 crc kubenswrapper[4771]: E0319 15:20:22.954831 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4fmgq" 
podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" Mar 19 15:20:23 crc kubenswrapper[4771]: I0319 15:20:23.028090 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:20:23 crc kubenswrapper[4771]: I0319 15:20:23.028210 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:20:23 crc kubenswrapper[4771]: W0319 15:20:23.246190 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc681a9f8_ad65_46af_b5a2_3ea110cda37f.slice/crio-2c911b76c113583fe798f561ea0bcc4a56a1f8066fc585375294cdf9107832f4 WatchSource:0}: Error finding container 2c911b76c113583fe798f561ea0bcc4a56a1f8066fc585375294cdf9107832f4: Status 404 returned error can't find the container with id 2c911b76c113583fe798f561ea0bcc4a56a1f8066fc585375294cdf9107832f4 Mar 19 15:20:24 crc kubenswrapper[4771]: I0319 15:20:24.050288 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkz7z" event={"ID":"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07","Type":"ContainerStarted","Data":"a1728887f917bf4394aba75df6e60e54b6c80a605411f4268cde7757b084a068"} Mar 19 15:20:24 crc kubenswrapper[4771]: I0319 15:20:24.051471 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565560-5rgwr" event={"ID":"c681a9f8-ad65-46af-b5a2-3ea110cda37f","Type":"ContainerStarted","Data":"2c911b76c113583fe798f561ea0bcc4a56a1f8066fc585375294cdf9107832f4"} Mar 19 15:20:25 crc 
kubenswrapper[4771]: I0319 15:20:25.182052 4771 scope.go:117] "RemoveContainer" containerID="3648d2df29aba1b48da9f08be625e2aea187022a5840e768f6152976bf3630aa" Mar 19 15:20:25 crc kubenswrapper[4771]: E0319 15:20:25.190010 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 15:20:25 crc kubenswrapper[4771]: E0319 15:20:25.190160 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b68ws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fallba
ckToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-r6j98_openshift-marketplace(6d94fc0b-9a51-41a1-b346-767fa239b631): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 15:20:25 crc kubenswrapper[4771]: E0319 15:20:25.191282 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-r6j98" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" Mar 19 15:20:25 crc kubenswrapper[4771]: I0319 15:20:25.579739 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"] Mar 19 15:20:25 crc kubenswrapper[4771]: W0319 15:20:25.607274 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9355fee9_a66a_4e7d_b903_92cefb7c193a.slice/crio-d13a467903898688ac022f1a78759c3092496cb2084f123f9b0fe5a869758736 WatchSource:0}: Error finding container d13a467903898688ac022f1a78759c3092496cb2084f123f9b0fe5a869758736: Status 404 returned error can't find the container with id d13a467903898688ac022f1a78759c3092496cb2084f123f9b0fe5a869758736 Mar 19 15:20:25 crc kubenswrapper[4771]: I0319 15:20:25.633240 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4"] Mar 19 15:20:25 crc kubenswrapper[4771]: W0319 15:20:25.676400 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8718c475_3f85_4923_9d02_e569281b9d5a.slice/crio-d1981ca87fe984e0d4b470033e0f8ae8e71ee00fcc02c80a0f6355e0d21a5f69 WatchSource:0}: Error finding container d1981ca87fe984e0d4b470033e0f8ae8e71ee00fcc02c80a0f6355e0d21a5f69: Status 404 returned error can't find the container with id d1981ca87fe984e0d4b470033e0f8ae8e71ee00fcc02c80a0f6355e0d21a5f69 Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.048224 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rdqrx" Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.068661 4771 generic.go:334] "Generic (PLEG): container finished" podID="b49408ed-5087-4cb2-b70e-391c32aad069" containerID="fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38" exitCode=0 Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.068760 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzb8" event={"ID":"b49408ed-5087-4cb2-b70e-391c32aad069","Type":"ContainerDied","Data":"fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38"} Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.070731 4771 generic.go:334] "Generic (PLEG): container finished" podID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerID="c01334afb4bbbb050969cebde0e2b368a27f1fec2c7f8cac60f1df629a49f237" exitCode=0 Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.070839 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7zl5" event={"ID":"cc023f86-d23c-4ca1-810a-de7ece9bb340","Type":"ContainerDied","Data":"c01334afb4bbbb050969cebde0e2b368a27f1fec2c7f8cac60f1df629a49f237"} Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.072245 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" 
event={"ID":"8718c475-3f85-4923-9d02-e569281b9d5a","Type":"ContainerStarted","Data":"fd185355ace1ec625b3c68f39d7dc70a68fc4f8a70b26723cf5a2b1d28be4b7e"}
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.072294 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" event={"ID":"8718c475-3f85-4923-9d02-e569281b9d5a","Type":"ContainerStarted","Data":"d1981ca87fe984e0d4b470033e0f8ae8e71ee00fcc02c80a0f6355e0d21a5f69"}
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.072961 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4"
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.073914 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" event={"ID":"9355fee9-a66a-4e7d-b903-92cefb7c193a","Type":"ContainerStarted","Data":"6172d78881ae2623107bdb0f6ac52a55221a90209674453fc55832e61373fee4"}
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.073941 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" event={"ID":"9355fee9-a66a-4e7d-b903-92cefb7c193a","Type":"ContainerStarted","Data":"d13a467903898688ac022f1a78759c3092496cb2084f123f9b0fe5a869758736"}
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.074325 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.080013 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmtnc" event={"ID":"65b4c9ce-e8af-4eca-abf5-08149432aaa5","Type":"ContainerDied","Data":"ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff"}
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.079375 4771 generic.go:334] "Generic (PLEG): container finished" podID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerID="ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff" exitCode=0
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.090056 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"74c60410-ed09-4463-9d5d-f9eac4ebc6a5","Type":"ContainerStarted","Data":"1f9ffd0b408b41745aeb61b18f8f5a145cf9530252b25b3720396ef913fdbf15"}
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.095559 4771 generic.go:334] "Generic (PLEG): container finished" podID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerID="74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f" exitCode=0
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.096188 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkz7z" event={"ID":"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07","Type":"ContainerDied","Data":"74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f"}
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.112823 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.113903 4771 generic.go:334] "Generic (PLEG): container finished" podID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerID="b3568afb07849a298f2c1b0ac41b3b25c0fd2ddd179cffef2413b1b5c776c28e" exitCode=0
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.114004 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-566qg" event={"ID":"9b59b88d-6b8a-43dd-ae40-3091d533b8ae","Type":"ContainerDied","Data":"b3568afb07849a298f2c1b0ac41b3b25c0fd2ddd179cffef2413b1b5c776c28e"}
Mar 19 15:20:26 crc kubenswrapper[4771]: E0319 15:20:26.142283 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-r6j98" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631"
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.194713 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" podStartSLOduration=19.194677721 podStartE2EDuration="19.194677721s" podCreationTimestamp="2026-03-19 15:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:20:26.194170358 +0000 UTC m=+285.422791550" watchObservedRunningTime="2026-03-19 15:20:26.194677721 +0000 UTC m=+285.423298923"
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.244043 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" podStartSLOduration=19.244020925 podStartE2EDuration="19.244020925s" podCreationTimestamp="2026-03-19 15:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:20:26.241316935 +0000 UTC m=+285.469938137" watchObservedRunningTime="2026-03-19 15:20:26.244020925 +0000 UTC m=+285.472642127"
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.276909 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4"
Mar 19 15:20:26 crc kubenswrapper[4771]: I0319 15:20:26.387543 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=30.387505002 podStartE2EDuration="30.387505002s" podCreationTimestamp="2026-03-19 15:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:20:26.381912176 +0000 UTC m=+285.610533378" watchObservedRunningTime="2026-03-19 15:20:26.387505002 +0000 UTC m=+285.616126214"
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.144609 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmtnc" event={"ID":"65b4c9ce-e8af-4eca-abf5-08149432aaa5","Type":"ContainerStarted","Data":"615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562"}
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.149343 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c60410-ed09-4463-9d5d-f9eac4ebc6a5" containerID="1f9ffd0b408b41745aeb61b18f8f5a145cf9530252b25b3720396ef913fdbf15" exitCode=0
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.149440 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"74c60410-ed09-4463-9d5d-f9eac4ebc6a5","Type":"ContainerDied","Data":"1f9ffd0b408b41745aeb61b18f8f5a145cf9530252b25b3720396ef913fdbf15"}
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.165008 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dmtnc" podStartSLOduration=2.604472299 podStartE2EDuration="33.164963837s" podCreationTimestamp="2026-03-19 15:19:54 +0000 UTC" firstStartedPulling="2026-03-19 15:19:56.429303248 +0000 UTC m=+255.657924450" lastFinishedPulling="2026-03-19 15:20:26.989794786 +0000 UTC m=+286.218415988" observedRunningTime="2026-03-19 15:20:27.160679615 +0000 UTC m=+286.389300837" watchObservedRunningTime="2026-03-19 15:20:27.164963837 +0000 UTC m=+286.393585029"
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.426869 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"]
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.522961 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4"]
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.757716 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.759038 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.761343 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.762002 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.768031 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.951279 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/326c866a-522f-4c4c-91ee-7225c1a7d537-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"326c866a-522f-4c4c-91ee-7225c1a7d537\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 15:20:27 crc kubenswrapper[4771]: I0319 15:20:27.951346 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/326c866a-522f-4c4c-91ee-7225c1a7d537-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"326c866a-522f-4c4c-91ee-7225c1a7d537\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.052141 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/326c866a-522f-4c4c-91ee-7225c1a7d537-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"326c866a-522f-4c4c-91ee-7225c1a7d537\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.052295 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/326c866a-522f-4c4c-91ee-7225c1a7d537-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"326c866a-522f-4c4c-91ee-7225c1a7d537\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.052374 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/326c866a-522f-4c4c-91ee-7225c1a7d537-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"326c866a-522f-4c4c-91ee-7225c1a7d537\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.072717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/326c866a-522f-4c4c-91ee-7225c1a7d537-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"326c866a-522f-4c4c-91ee-7225c1a7d537\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.079295 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.162729 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-566qg" event={"ID":"9b59b88d-6b8a-43dd-ae40-3091d533b8ae","Type":"ContainerStarted","Data":"a4b6377fa7de1d216b7e77a11f725bfef1a75cbc9fea07947b9174e8648bd060"}
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.168595 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzb8" event={"ID":"b49408ed-5087-4cb2-b70e-391c32aad069","Type":"ContainerStarted","Data":"188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810"}
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.172933 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7zl5" event={"ID":"cc023f86-d23c-4ca1-810a-de7ece9bb340","Type":"ContainerStarted","Data":"b4d78c3918ca82b3e6993e1c2d6ee7453ad52bc1505b5abbfc8d19c0350e24d7"}
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.189855 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-566qg" podStartSLOduration=3.337361124 podStartE2EDuration="35.189761833s" podCreationTimestamp="2026-03-19 15:19:53 +0000 UTC" firstStartedPulling="2026-03-19 15:19:55.423745043 +0000 UTC m=+254.652366245" lastFinishedPulling="2026-03-19 15:20:27.276145752 +0000 UTC m=+286.504766954" observedRunningTime="2026-03-19 15:20:28.189150906 +0000 UTC m=+287.417772118" watchObservedRunningTime="2026-03-19 15:20:28.189761833 +0000 UTC m=+287.418383035"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.246892 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g7zl5" podStartSLOduration=2.55861964 podStartE2EDuration="33.24686091s" podCreationTimestamp="2026-03-19 15:19:55 +0000 UTC" firstStartedPulling="2026-03-19 15:19:56.422352281 +0000 UTC m=+255.650973483" lastFinishedPulling="2026-03-19 15:20:27.110593551 +0000 UTC m=+286.339214753" observedRunningTime="2026-03-19 15:20:28.217577167 +0000 UTC m=+287.446198369" watchObservedRunningTime="2026-03-19 15:20:28.24686091 +0000 UTC m=+287.475482112"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.526866 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rlzb8" podStartSLOduration=3.752208761 podStartE2EDuration="36.526845871s" podCreationTimestamp="2026-03-19 15:19:52 +0000 UTC" firstStartedPulling="2026-03-19 15:19:54.308547468 +0000 UTC m=+253.537168670" lastFinishedPulling="2026-03-19 15:20:27.083184578 +0000 UTC m=+286.311805780" observedRunningTime="2026-03-19 15:20:28.248712158 +0000 UTC m=+287.477333370" watchObservedRunningTime="2026-03-19 15:20:28.526845871 +0000 UTC m=+287.755467073"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.600197 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.625087 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.671196 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kubelet-dir\") pod \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\" (UID: \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\") "
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.671384 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kube-api-access\") pod \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\" (UID: \"74c60410-ed09-4463-9d5d-f9eac4ebc6a5\") "
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.673532 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "74c60410-ed09-4463-9d5d-f9eac4ebc6a5" (UID: "74c60410-ed09-4463-9d5d-f9eac4ebc6a5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.678130 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "74c60410-ed09-4463-9d5d-f9eac4ebc6a5" (UID: "74c60410-ed09-4463-9d5d-f9eac4ebc6a5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.773262 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:28 crc kubenswrapper[4771]: I0319 15:20:28.773299 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74c60410-ed09-4463-9d5d-f9eac4ebc6a5-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:29 crc kubenswrapper[4771]: I0319 15:20:29.190747 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"74c60410-ed09-4463-9d5d-f9eac4ebc6a5","Type":"ContainerDied","Data":"fe5c15255cb5150789343a4d576ff762ffe473c95ed019a942962c0f5f13f2cd"}
Mar 19 15:20:29 crc kubenswrapper[4771]: I0319 15:20:29.191074 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe5c15255cb5150789343a4d576ff762ffe473c95ed019a942962c0f5f13f2cd"
Mar 19 15:20:29 crc kubenswrapper[4771]: I0319 15:20:29.190977 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 15:20:29 crc kubenswrapper[4771]: I0319 15:20:29.192910 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"326c866a-522f-4c4c-91ee-7225c1a7d537","Type":"ContainerStarted","Data":"4ad51b4c972ed3816f5b8d161180c60f6d66d3478bba6a0ff0b32a4d8e86184f"}
Mar 19 15:20:29 crc kubenswrapper[4771]: I0319 15:20:29.193373 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" podUID="8718c475-3f85-4923-9d02-e569281b9d5a" containerName="route-controller-manager" containerID="cri-o://fd185355ace1ec625b3c68f39d7dc70a68fc4f8a70b26723cf5a2b1d28be4b7e" gracePeriod=30
Mar 19 15:20:29 crc kubenswrapper[4771]: I0319 15:20:29.193676 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" podUID="9355fee9-a66a-4e7d-b903-92cefb7c193a" containerName="controller-manager" containerID="cri-o://6172d78881ae2623107bdb0f6ac52a55221a90209674453fc55832e61373fee4" gracePeriod=30
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.211826 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"326c866a-522f-4c4c-91ee-7225c1a7d537","Type":"ContainerStarted","Data":"3b64762ddce80c2a090667db7d5d20e028b3200ca1f4d2f5c5f9360ac8f01363"}
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.215696 4771 generic.go:334] "Generic (PLEG): container finished" podID="9355fee9-a66a-4e7d-b903-92cefb7c193a" containerID="6172d78881ae2623107bdb0f6ac52a55221a90209674453fc55832e61373fee4" exitCode=0
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.215755 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" event={"ID":"9355fee9-a66a-4e7d-b903-92cefb7c193a","Type":"ContainerDied","Data":"6172d78881ae2623107bdb0f6ac52a55221a90209674453fc55832e61373fee4"}
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.222208 4771 generic.go:334] "Generic (PLEG): container finished" podID="8718c475-3f85-4923-9d02-e569281b9d5a" containerID="fd185355ace1ec625b3c68f39d7dc70a68fc4f8a70b26723cf5a2b1d28be4b7e" exitCode=0
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.222307 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" event={"ID":"8718c475-3f85-4923-9d02-e569281b9d5a","Type":"ContainerDied","Data":"fd185355ace1ec625b3c68f39d7dc70a68fc4f8a70b26723cf5a2b1d28be4b7e"}
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.827067 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.830745 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.909264 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-config\") pod \"8718c475-3f85-4923-9d02-e569281b9d5a\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") "
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.909314 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-config\") pod \"9355fee9-a66a-4e7d-b903-92cefb7c193a\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") "
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.909347 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9355fee9-a66a-4e7d-b903-92cefb7c193a-serving-cert\") pod \"9355fee9-a66a-4e7d-b903-92cefb7c193a\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") "
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.909372 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl56f\" (UniqueName: \"kubernetes.io/projected/8718c475-3f85-4923-9d02-e569281b9d5a-kube-api-access-vl56f\") pod \"8718c475-3f85-4923-9d02-e569281b9d5a\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") "
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.909420 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-client-ca\") pod \"8718c475-3f85-4923-9d02-e569281b9d5a\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") "
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.909439 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8718c475-3f85-4923-9d02-e569281b9d5a-serving-cert\") pod \"8718c475-3f85-4923-9d02-e569281b9d5a\" (UID: \"8718c475-3f85-4923-9d02-e569281b9d5a\") "
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.909474 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nll8v\" (UniqueName: \"kubernetes.io/projected/9355fee9-a66a-4e7d-b903-92cefb7c193a-kube-api-access-nll8v\") pod \"9355fee9-a66a-4e7d-b903-92cefb7c193a\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") "
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.909574 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-client-ca\") pod \"9355fee9-a66a-4e7d-b903-92cefb7c193a\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") "
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.909615 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-proxy-ca-bundles\") pod \"9355fee9-a66a-4e7d-b903-92cefb7c193a\" (UID: \"9355fee9-a66a-4e7d-b903-92cefb7c193a\") "
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.910309 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9355fee9-a66a-4e7d-b903-92cefb7c193a" (UID: "9355fee9-a66a-4e7d-b903-92cefb7c193a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.910435 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-config" (OuterVolumeSpecName: "config") pod "9355fee9-a66a-4e7d-b903-92cefb7c193a" (UID: "9355fee9-a66a-4e7d-b903-92cefb7c193a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.910966 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-client-ca" (OuterVolumeSpecName: "client-ca") pod "9355fee9-a66a-4e7d-b903-92cefb7c193a" (UID: "9355fee9-a66a-4e7d-b903-92cefb7c193a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.911122 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-config" (OuterVolumeSpecName: "config") pod "8718c475-3f85-4923-9d02-e569281b9d5a" (UID: "8718c475-3f85-4923-9d02-e569281b9d5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.911419 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-client-ca" (OuterVolumeSpecName: "client-ca") pod "8718c475-3f85-4923-9d02-e569281b9d5a" (UID: "8718c475-3f85-4923-9d02-e569281b9d5a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.929227 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8718c475-3f85-4923-9d02-e569281b9d5a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8718c475-3f85-4923-9d02-e569281b9d5a" (UID: "8718c475-3f85-4923-9d02-e569281b9d5a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.929237 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9355fee9-a66a-4e7d-b903-92cefb7c193a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9355fee9-a66a-4e7d-b903-92cefb7c193a" (UID: "9355fee9-a66a-4e7d-b903-92cefb7c193a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.929281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9355fee9-a66a-4e7d-b903-92cefb7c193a-kube-api-access-nll8v" (OuterVolumeSpecName: "kube-api-access-nll8v") pod "9355fee9-a66a-4e7d-b903-92cefb7c193a" (UID: "9355fee9-a66a-4e7d-b903-92cefb7c193a"). InnerVolumeSpecName "kube-api-access-nll8v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.929303 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8718c475-3f85-4923-9d02-e569281b9d5a-kube-api-access-vl56f" (OuterVolumeSpecName: "kube-api-access-vl56f") pod "8718c475-3f85-4923-9d02-e569281b9d5a" (UID: "8718c475-3f85-4923-9d02-e569281b9d5a"). InnerVolumeSpecName "kube-api-access-vl56f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.973776 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"]
Mar 19 15:20:30 crc kubenswrapper[4771]: E0319 15:20:30.974104 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c60410-ed09-4463-9d5d-f9eac4ebc6a5" containerName="pruner"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.974116 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c60410-ed09-4463-9d5d-f9eac4ebc6a5" containerName="pruner"
Mar 19 15:20:30 crc kubenswrapper[4771]: E0319 15:20:30.974127 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9355fee9-a66a-4e7d-b903-92cefb7c193a" containerName="controller-manager"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.974134 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9355fee9-a66a-4e7d-b903-92cefb7c193a" containerName="controller-manager"
Mar 19 15:20:30 crc kubenswrapper[4771]: E0319 15:20:30.974146 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8718c475-3f85-4923-9d02-e569281b9d5a" containerName="route-controller-manager"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.974152 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8718c475-3f85-4923-9d02-e569281b9d5a" containerName="route-controller-manager"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.974243 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8718c475-3f85-4923-9d02-e569281b9d5a" containerName="route-controller-manager"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.974251 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9355fee9-a66a-4e7d-b903-92cefb7c193a" containerName="controller-manager"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.974260 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c60410-ed09-4463-9d5d-f9eac4ebc6a5" containerName="pruner"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.974636 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:30 crc kubenswrapper[4771]: I0319 15:20:30.977685 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"]
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012143 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwtv\" (UniqueName: \"kubernetes.io/projected/fb41ac02-a683-418a-8c0a-97d3690cbe7a-kube-api-access-mlwtv\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012242 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb41ac02-a683-418a-8c0a-97d3690cbe7a-serving-cert\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012308 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-config\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012337 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-client-ca\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012385 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-proxy-ca-bundles\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012435 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012447 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012459 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-config\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012470 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9355fee9-a66a-4e7d-b903-92cefb7c193a-config\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012480 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9355fee9-a66a-4e7d-b903-92cefb7c193a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012493 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl56f\" (UniqueName: \"kubernetes.io/projected/8718c475-3f85-4923-9d02-e569281b9d5a-kube-api-access-vl56f\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012668 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8718c475-3f85-4923-9d02-e569281b9d5a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012683 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8718c475-3f85-4923-9d02-e569281b9d5a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.012697 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nll8v\" (UniqueName: \"kubernetes.io/projected/9355fee9-a66a-4e7d-b903-92cefb7c193a-kube-api-access-nll8v\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.113065 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwtv\" (UniqueName: \"kubernetes.io/projected/fb41ac02-a683-418a-8c0a-97d3690cbe7a-kube-api-access-mlwtv\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.113129 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb41ac02-a683-418a-8c0a-97d3690cbe7a-serving-cert\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.113177 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-config\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.113195 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-client-ca\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.113232 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-proxy-ca-bundles\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.114331 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-client-ca\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.114688 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-proxy-ca-bundles\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.115088 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-config\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.118604 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb41ac02-a683-418a-8c0a-97d3690cbe7a-serving-cert\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.129164 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwtv\" (UniqueName: \"kubernetes.io/projected/fb41ac02-a683-418a-8c0a-97d3690cbe7a-kube-api-access-mlwtv\") pod \"controller-manager-6b9f96b47-8f9cz\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.233552 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.233569 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b69c9fbf-ll92d" event={"ID":"9355fee9-a66a-4e7d-b903-92cefb7c193a","Type":"ContainerDied","Data":"d13a467903898688ac022f1a78759c3092496cb2084f123f9b0fe5a869758736"}
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.234232 4771 scope.go:117] "RemoveContainer" containerID="6172d78881ae2623107bdb0f6ac52a55221a90209674453fc55832e61373fee4"
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.236595 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" event={"ID":"8718c475-3f85-4923-9d02-e569281b9d5a","Type":"ContainerDied","Data":"d1981ca87fe984e0d4b470033e0f8ae8e71ee00fcc02c80a0f6355e0d21a5f69"}
Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.236664 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4" Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.254023 4771 generic.go:334] "Generic (PLEG): container finished" podID="326c866a-522f-4c4c-91ee-7225c1a7d537" containerID="3b64762ddce80c2a090667db7d5d20e028b3200ca1f4d2f5c5f9360ac8f01363" exitCode=0 Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.254140 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"326c866a-522f-4c4c-91ee-7225c1a7d537","Type":"ContainerDied","Data":"3b64762ddce80c2a090667db7d5d20e028b3200ca1f4d2f5c5f9360ac8f01363"} Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.278421 4771 scope.go:117] "RemoveContainer" containerID="fd185355ace1ec625b3c68f39d7dc70a68fc4f8a70b26723cf5a2b1d28be4b7e" Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.292846 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.300205 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4"] Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.306018 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4b4bcb8f-5vrf4"] Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.309722 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"] Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.317828 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54b69c9fbf-ll92d"] Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.474483 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"] Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.519474 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8718c475-3f85-4923-9d02-e569281b9d5a" path="/var/lib/kubelet/pods/8718c475-3f85-4923-9d02-e569281b9d5a/volumes" Mar 19 15:20:31 crc kubenswrapper[4771]: I0319 15:20:31.520296 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9355fee9-a66a-4e7d-b903-92cefb7c193a" path="/var/lib/kubelet/pods/9355fee9-a66a-4e7d-b903-92cefb7c193a/volumes" Mar 19 15:20:32 crc kubenswrapper[4771]: I0319 15:20:32.261124 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" event={"ID":"fb41ac02-a683-418a-8c0a-97d3690cbe7a","Type":"ContainerStarted","Data":"66b8fb6a57ce4735789db82deb2fb82b704c7545f5eb74e0dfb1afe052b4480c"} Mar 19 15:20:32 crc kubenswrapper[4771]: I0319 15:20:32.995130 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:20:32 crc kubenswrapper[4771]: I0319 15:20:32.995586 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:20:33 crc kubenswrapper[4771]: I0319 15:20:33.276798 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" event={"ID":"fb41ac02-a683-418a-8c0a-97d3690cbe7a","Type":"ContainerStarted","Data":"62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f"} Mar 19 15:20:33 crc kubenswrapper[4771]: I0319 15:20:33.279635 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" Mar 19 15:20:33 crc kubenswrapper[4771]: I0319 15:20:33.292581 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" Mar 19 15:20:33 crc kubenswrapper[4771]: I0319 15:20:33.303878 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" podStartSLOduration=6.303859464 podStartE2EDuration="6.303859464s" podCreationTimestamp="2026-03-19 15:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:20:33.300081856 +0000 UTC m=+292.528703058" watchObservedRunningTime="2026-03-19 15:20:33.303859464 +0000 UTC m=+292.532480666" Mar 19 15:20:33 crc kubenswrapper[4771]: I0319 15:20:33.616523 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-566qg" Mar 19 15:20:33 crc kubenswrapper[4771]: I0319 15:20:33.616671 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-566qg" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.066959 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qj9qq"] Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.553969 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.556039 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.583183 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.674921 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6730def2-f246-44ea-9ee9-93f4e490008d-kube-api-access\") pod \"installer-9-crc\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.675054 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.675085 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-var-lock\") pod \"installer-9-crc\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.696467 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-566qg" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.697680 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.776508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/6730def2-f246-44ea-9ee9-93f4e490008d-kube-api-access\") pod \"installer-9-crc\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.776621 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.776677 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-var-lock\") pod \"installer-9-crc\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.776845 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-var-lock\") pod \"installer-9-crc\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.776852 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.783450 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.802642 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6730def2-f246-44ea-9ee9-93f4e490008d-kube-api-access\") pod \"installer-9-crc\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.889555 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.977101 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n"] Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.977763 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.981113 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.981883 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.983223 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.983503 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.983520 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 15:20:34 crc kubenswrapper[4771]: I0319 15:20:34.983867 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 15:20:34 
crc kubenswrapper[4771]: I0319 15:20:34.988278 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n"] Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.008711 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dmtnc" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.008764 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dmtnc" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.074106 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dmtnc" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.083284 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-config\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.083517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bzf\" (UniqueName: \"kubernetes.io/projected/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-kube-api-access-49bzf\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.083687 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-client-ca\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: 
\"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.083743 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-serving-cert\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.185310 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49bzf\" (UniqueName: \"kubernetes.io/projected/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-kube-api-access-49bzf\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.185401 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-client-ca\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.185435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-serving-cert\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.185527 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-config\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.186973 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-client-ca\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.187283 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-config\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.191270 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-serving-cert\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.212110 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49bzf\" (UniqueName: \"kubernetes.io/projected/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-kube-api-access-49bzf\") pod \"route-controller-manager-9ddc4fc97-wb22n\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " 
pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.314490 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.338302 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dmtnc" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.343225 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-566qg" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.404910 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.405224 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:20:35 crc kubenswrapper[4771]: I0319 15:20:35.448416 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:20:36 crc kubenswrapper[4771]: I0319 15:20:36.391529 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:20:36 crc kubenswrapper[4771]: I0319 15:20:36.440143 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 15:20:36 crc kubenswrapper[4771]: I0319 15:20:36.551174 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/326c866a-522f-4c4c-91ee-7225c1a7d537-kube-api-access\") pod \"326c866a-522f-4c4c-91ee-7225c1a7d537\" (UID: \"326c866a-522f-4c4c-91ee-7225c1a7d537\") " Mar 19 15:20:36 crc kubenswrapper[4771]: I0319 15:20:36.551246 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/326c866a-522f-4c4c-91ee-7225c1a7d537-kubelet-dir\") pod \"326c866a-522f-4c4c-91ee-7225c1a7d537\" (UID: \"326c866a-522f-4c4c-91ee-7225c1a7d537\") " Mar 19 15:20:36 crc kubenswrapper[4771]: I0319 15:20:36.551774 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/326c866a-522f-4c4c-91ee-7225c1a7d537-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "326c866a-522f-4c4c-91ee-7225c1a7d537" (UID: "326c866a-522f-4c4c-91ee-7225c1a7d537"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:20:36 crc kubenswrapper[4771]: I0319 15:20:36.560217 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/326c866a-522f-4c4c-91ee-7225c1a7d537-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "326c866a-522f-4c4c-91ee-7225c1a7d537" (UID: "326c866a-522f-4c4c-91ee-7225c1a7d537"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:36 crc kubenswrapper[4771]: I0319 15:20:36.652608 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/326c866a-522f-4c4c-91ee-7225c1a7d537-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:36 crc kubenswrapper[4771]: I0319 15:20:36.652648 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/326c866a-522f-4c4c-91ee-7225c1a7d537-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:37 crc kubenswrapper[4771]: I0319 15:20:37.305549 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"326c866a-522f-4c4c-91ee-7225c1a7d537","Type":"ContainerDied","Data":"4ad51b4c972ed3816f5b8d161180c60f6d66d3478bba6a0ff0b32a4d8e86184f"} Mar 19 15:20:37 crc kubenswrapper[4771]: I0319 15:20:37.305683 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 15:20:37 crc kubenswrapper[4771]: I0319 15:20:37.306027 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ad51b4c972ed3816f5b8d161180c60f6d66d3478bba6a0ff0b32a4d8e86184f" Mar 19 15:20:37 crc kubenswrapper[4771]: I0319 15:20:37.388423 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7zl5"] Mar 19 15:20:37 crc kubenswrapper[4771]: I0319 15:20:37.985943 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-566qg"] Mar 19 15:20:37 crc kubenswrapper[4771]: I0319 15:20:37.986271 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-566qg" podUID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerName="registry-server" containerID="cri-o://a4b6377fa7de1d216b7e77a11f725bfef1a75cbc9fea07947b9174e8648bd060" gracePeriod=2 Mar 19 15:20:38 crc kubenswrapper[4771]: I0319 15:20:38.309896 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g7zl5" podUID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerName="registry-server" containerID="cri-o://b4d78c3918ca82b3e6993e1c2d6ee7453ad52bc1505b5abbfc8d19c0350e24d7" gracePeriod=2 Mar 19 15:20:39 crc kubenswrapper[4771]: I0319 15:20:39.319189 4771 generic.go:334] "Generic (PLEG): container finished" podID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerID="b4d78c3918ca82b3e6993e1c2d6ee7453ad52bc1505b5abbfc8d19c0350e24d7" exitCode=0 Mar 19 15:20:39 crc kubenswrapper[4771]: I0319 15:20:39.319259 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7zl5" event={"ID":"cc023f86-d23c-4ca1-810a-de7ece9bb340","Type":"ContainerDied","Data":"b4d78c3918ca82b3e6993e1c2d6ee7453ad52bc1505b5abbfc8d19c0350e24d7"} Mar 19 15:20:40 crc kubenswrapper[4771]: 
I0319 15:20:40.328835 4771 generic.go:334] "Generic (PLEG): container finished" podID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerID="a4b6377fa7de1d216b7e77a11f725bfef1a75cbc9fea07947b9174e8648bd060" exitCode=0 Mar 19 15:20:40 crc kubenswrapper[4771]: I0319 15:20:40.328893 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-566qg" event={"ID":"9b59b88d-6b8a-43dd-ae40-3091d533b8ae","Type":"ContainerDied","Data":"a4b6377fa7de1d216b7e77a11f725bfef1a75cbc9fea07947b9174e8648bd060"} Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.216534 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.325103 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-566qg" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.330292 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-utilities\") pod \"cc023f86-d23c-4ca1-810a-de7ece9bb340\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.330384 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxvd6\" (UniqueName: \"kubernetes.io/projected/cc023f86-d23c-4ca1-810a-de7ece9bb340-kube-api-access-kxvd6\") pod \"cc023f86-d23c-4ca1-810a-de7ece9bb340\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") " Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.330478 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-catalog-content\") pod \"cc023f86-d23c-4ca1-810a-de7ece9bb340\" (UID: \"cc023f86-d23c-4ca1-810a-de7ece9bb340\") 
" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.331484 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-utilities" (OuterVolumeSpecName: "utilities") pod "cc023f86-d23c-4ca1-810a-de7ece9bb340" (UID: "cc023f86-d23c-4ca1-810a-de7ece9bb340"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.337093 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc023f86-d23c-4ca1-810a-de7ece9bb340-kube-api-access-kxvd6" (OuterVolumeSpecName: "kube-api-access-kxvd6") pod "cc023f86-d23c-4ca1-810a-de7ece9bb340" (UID: "cc023f86-d23c-4ca1-810a-de7ece9bb340"). InnerVolumeSpecName "kube-api-access-kxvd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.339422 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7zl5" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.339482 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7zl5" event={"ID":"cc023f86-d23c-4ca1-810a-de7ece9bb340","Type":"ContainerDied","Data":"c2c6e67d37b7443339f3db8acc75ab8b026540a050c3c8db3b7abbe492c04503"} Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.339700 4771 scope.go:117] "RemoveContainer" containerID="b4d78c3918ca82b3e6993e1c2d6ee7453ad52bc1505b5abbfc8d19c0350e24d7" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.343223 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tdfqz" event={"ID":"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4","Type":"ContainerStarted","Data":"777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc"} Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.359676 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkz7z" event={"ID":"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07","Type":"ContainerStarted","Data":"ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f"} Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.361883 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-566qg" event={"ID":"9b59b88d-6b8a-43dd-ae40-3091d533b8ae","Type":"ContainerDied","Data":"9fdfa9fd900f326db938c19c2f7cd613357f9b0ba350d21eb70dd2a28956c1c3"} Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.362033 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-566qg" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.371645 4771 scope.go:117] "RemoveContainer" containerID="c01334afb4bbbb050969cebde0e2b368a27f1fec2c7f8cac60f1df629a49f237" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.406260 4771 scope.go:117] "RemoveContainer" containerID="39a362076694bcd779c42b709b9152b5bbebee011b78abe9e20a0f299b24f2d2" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.424110 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc023f86-d23c-4ca1-810a-de7ece9bb340" (UID: "cc023f86-d23c-4ca1-810a-de7ece9bb340"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.433544 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-catalog-content\") pod \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.433667 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-utilities\") pod \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.433715 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h947r\" (UniqueName: \"kubernetes.io/projected/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-kube-api-access-h947r\") pod \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\" (UID: \"9b59b88d-6b8a-43dd-ae40-3091d533b8ae\") " Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 
15:20:41.434167 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.434191 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxvd6\" (UniqueName: \"kubernetes.io/projected/cc023f86-d23c-4ca1-810a-de7ece9bb340-kube-api-access-kxvd6\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.434205 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc023f86-d23c-4ca1-810a-de7ece9bb340-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.435335 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-utilities" (OuterVolumeSpecName: "utilities") pod "9b59b88d-6b8a-43dd-ae40-3091d533b8ae" (UID: "9b59b88d-6b8a-43dd-ae40-3091d533b8ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.441853 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-kube-api-access-h947r" (OuterVolumeSpecName: "kube-api-access-h947r") pod "9b59b88d-6b8a-43dd-ae40-3091d533b8ae" (UID: "9b59b88d-6b8a-43dd-ae40-3091d533b8ae"). InnerVolumeSpecName "kube-api-access-h947r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.442036 4771 scope.go:117] "RemoveContainer" containerID="a4b6377fa7de1d216b7e77a11f725bfef1a75cbc9fea07947b9174e8648bd060" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.456934 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.481551 4771 scope.go:117] "RemoveContainer" containerID="b3568afb07849a298f2c1b0ac41b3b25c0fd2ddd179cffef2413b1b5c776c28e" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.529241 4771 scope.go:117] "RemoveContainer" containerID="cb6b3e16022f09819a9b95db9c47ccbbd7c7536db00bf90365302dc63e377758" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.537801 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.538026 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h947r\" (UniqueName: \"kubernetes.io/projected/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-kube-api-access-h947r\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.572033 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n"] Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.643742 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b59b88d-6b8a-43dd-ae40-3091d533b8ae" (UID: "9b59b88d-6b8a-43dd-ae40-3091d533b8ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.661860 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7zl5"] Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.665824 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7zl5"] Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.741429 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-566qg"] Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.742761 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b59b88d-6b8a-43dd-ae40-3091d533b8ae-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.763826 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-566qg"] Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.938338 4771 csr.go:261] certificate signing request csr-rvhc5 is approved, waiting to be issued Mar 19 15:20:41 crc kubenswrapper[4771]: I0319 15:20:41.947141 4771 csr.go:257] certificate signing request csr-rvhc5 is issued Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.368512 4771 generic.go:334] "Generic (PLEG): container finished" podID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerID="777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc" exitCode=0 Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.368568 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tdfqz" event={"ID":"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4","Type":"ContainerDied","Data":"777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc"} Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.370458 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6730def2-f246-44ea-9ee9-93f4e490008d","Type":"ContainerStarted","Data":"ee0f7aacd635440fd47722a932fb07df752eb188e88672ae56dc9d9d9bd0df8d"} Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.370488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6730def2-f246-44ea-9ee9-93f4e490008d","Type":"ContainerStarted","Data":"697d9972d87841f97f90f55f0213281add336109b44c0917d95d48f781238112"} Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.372166 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" event={"ID":"3f867f7d-795a-44c9-9d88-eeb13f54bc1c","Type":"ContainerStarted","Data":"9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248"} Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.372218 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" event={"ID":"3f867f7d-795a-44c9-9d88-eeb13f54bc1c","Type":"ContainerStarted","Data":"8565b99f836dbe5cf12691fcc0e35135edea99834e2574faa44d673700bf5e65"} Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.374355 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.376208 4771 generic.go:334] "Generic (PLEG): container finished" podID="af3ce0f9-bc02-4142-8655-9751fe9197db" containerID="eefa995bf453314bfdaac302525fa3aafee26a1072abb2daf857895c98c1e5d6" exitCode=0 Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.376287 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565558-wvlb8" 
event={"ID":"af3ce0f9-bc02-4142-8655-9751fe9197db","Type":"ContainerDied","Data":"eefa995bf453314bfdaac302525fa3aafee26a1072abb2daf857895c98c1e5d6"} Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.379310 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.379348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565560-5rgwr" event={"ID":"c681a9f8-ad65-46af-b5a2-3ea110cda37f","Type":"ContainerStarted","Data":"8c83bee82a9df53269fc854fe79a2c739db69bbf5a840b57b873b8a056b01e5a"} Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.407437 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.407414252 podStartE2EDuration="8.407414252s" podCreationTimestamp="2026-03-19 15:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:20:42.406838758 +0000 UTC m=+301.635459990" watchObservedRunningTime="2026-03-19 15:20:42.407414252 +0000 UTC m=+301.636035474" Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.446094 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" podStartSLOduration=15.446077029 podStartE2EDuration="15.446077029s" podCreationTimestamp="2026-03-19 15:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:20:42.424048376 +0000 UTC m=+301.652669578" watchObservedRunningTime="2026-03-19 15:20:42.446077029 +0000 UTC m=+301.674698231" Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.463087 4771 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-infra/auto-csr-approver-29565560-5rgwr" podStartSLOduration=24.708312246 podStartE2EDuration="42.463072722s" podCreationTimestamp="2026-03-19 15:20:00 +0000 UTC" firstStartedPulling="2026-03-19 15:20:23.249784905 +0000 UTC m=+282.478406137" lastFinishedPulling="2026-03-19 15:20:41.004545411 +0000 UTC m=+300.233166613" observedRunningTime="2026-03-19 15:20:42.46223216 +0000 UTC m=+301.690853352" watchObservedRunningTime="2026-03-19 15:20:42.463072722 +0000 UTC m=+301.691693914" Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.948675 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-18 09:44:33.419104378 +0000 UTC Mar 19 15:20:42 crc kubenswrapper[4771]: I0319 15:20:42.948718 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5850h23m50.47039106s for next certificate rotation Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.387861 4771 generic.go:334] "Generic (PLEG): container finished" podID="c681a9f8-ad65-46af-b5a2-3ea110cda37f" containerID="8c83bee82a9df53269fc854fe79a2c739db69bbf5a840b57b873b8a056b01e5a" exitCode=0 Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.389263 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565560-5rgwr" event={"ID":"c681a9f8-ad65-46af-b5a2-3ea110cda37f","Type":"ContainerDied","Data":"8c83bee82a9df53269fc854fe79a2c739db69bbf5a840b57b873b8a056b01e5a"} Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.392762 4771 generic.go:334] "Generic (PLEG): container finished" podID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerID="dec947573a9131191b800de798810731e3d72b766abcebb33bf3eabf9b1a456a" exitCode=0 Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.392858 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6j98" 
event={"ID":"6d94fc0b-9a51-41a1-b346-767fa239b631","Type":"ContainerDied","Data":"dec947573a9131191b800de798810731e3d72b766abcebb33bf3eabf9b1a456a"} Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.397217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkz7z" event={"ID":"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07","Type":"ContainerDied","Data":"ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f"} Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.397264 4771 generic.go:334] "Generic (PLEG): container finished" podID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerID="ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f" exitCode=0 Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.518796 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" path="/var/lib/kubelet/pods/9b59b88d-6b8a-43dd-ae40-3091d533b8ae/volumes" Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.520042 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc023f86-d23c-4ca1-810a-de7ece9bb340" path="/var/lib/kubelet/pods/cc023f86-d23c-4ca1-810a-de7ece9bb340/volumes" Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.728577 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565558-wvlb8" Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.874470 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjb4j\" (UniqueName: \"kubernetes.io/projected/af3ce0f9-bc02-4142-8655-9751fe9197db-kube-api-access-vjb4j\") pod \"af3ce0f9-bc02-4142-8655-9751fe9197db\" (UID: \"af3ce0f9-bc02-4142-8655-9751fe9197db\") " Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.881258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3ce0f9-bc02-4142-8655-9751fe9197db-kube-api-access-vjb4j" (OuterVolumeSpecName: "kube-api-access-vjb4j") pod "af3ce0f9-bc02-4142-8655-9751fe9197db" (UID: "af3ce0f9-bc02-4142-8655-9751fe9197db"). InnerVolumeSpecName "kube-api-access-vjb4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.949848 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-09 12:24:35.345492784 +0000 UTC Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.949896 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6357h3m51.395599815s for next certificate rotation Mar 19 15:20:43 crc kubenswrapper[4771]: I0319 15:20:43.976161 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjb4j\" (UniqueName: \"kubernetes.io/projected/af3ce0f9-bc02-4142-8655-9751fe9197db-kube-api-access-vjb4j\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:44 crc kubenswrapper[4771]: I0319 15:20:44.403637 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565558-wvlb8" event={"ID":"af3ce0f9-bc02-4142-8655-9751fe9197db","Type":"ContainerDied","Data":"a18a109f93aac2a6f097c6a886b1af52bad7df99857ce81938a3b494936e06f9"} Mar 19 15:20:44 crc kubenswrapper[4771]: I0319 
15:20:44.403690 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a18a109f93aac2a6f097c6a886b1af52bad7df99857ce81938a3b494936e06f9" Mar 19 15:20:44 crc kubenswrapper[4771]: I0319 15:20:44.403755 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565558-wvlb8" Mar 19 15:20:44 crc kubenswrapper[4771]: I0319 15:20:44.739405 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565560-5rgwr" Mar 19 15:20:44 crc kubenswrapper[4771]: I0319 15:20:44.886550 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5fmh\" (UniqueName: \"kubernetes.io/projected/c681a9f8-ad65-46af-b5a2-3ea110cda37f-kube-api-access-w5fmh\") pod \"c681a9f8-ad65-46af-b5a2-3ea110cda37f\" (UID: \"c681a9f8-ad65-46af-b5a2-3ea110cda37f\") " Mar 19 15:20:44 crc kubenswrapper[4771]: I0319 15:20:44.906871 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c681a9f8-ad65-46af-b5a2-3ea110cda37f-kube-api-access-w5fmh" (OuterVolumeSpecName: "kube-api-access-w5fmh") pod "c681a9f8-ad65-46af-b5a2-3ea110cda37f" (UID: "c681a9f8-ad65-46af-b5a2-3ea110cda37f"). InnerVolumeSpecName "kube-api-access-w5fmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:44 crc kubenswrapper[4771]: I0319 15:20:44.988528 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5fmh\" (UniqueName: \"kubernetes.io/projected/c681a9f8-ad65-46af-b5a2-3ea110cda37f-kube-api-access-w5fmh\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:45 crc kubenswrapper[4771]: I0319 15:20:45.409764 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565560-5rgwr" event={"ID":"c681a9f8-ad65-46af-b5a2-3ea110cda37f","Type":"ContainerDied","Data":"2c911b76c113583fe798f561ea0bcc4a56a1f8066fc585375294cdf9107832f4"} Mar 19 15:20:45 crc kubenswrapper[4771]: I0319 15:20:45.409804 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c911b76c113583fe798f561ea0bcc4a56a1f8066fc585375294cdf9107832f4" Mar 19 15:20:45 crc kubenswrapper[4771]: I0319 15:20:45.409836 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565560-5rgwr" Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.421727 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fmgq" event={"ID":"ae9495d8-bbe9-4f54-8c12-56b9f40530e1","Type":"ContainerStarted","Data":"ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80"} Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.424822 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tdfqz" event={"ID":"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4","Type":"ContainerStarted","Data":"06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c"} Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.427452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6j98" 
event={"ID":"6d94fc0b-9a51-41a1-b346-767fa239b631","Type":"ContainerStarted","Data":"a62f3fc60a366e767b6a0f9ded31782078e3110062adb178ec9bf703a4c4393b"} Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.429840 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkz7z" event={"ID":"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07","Type":"ContainerStarted","Data":"685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a"} Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.440021 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"] Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.440465 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" podUID="fb41ac02-a683-418a-8c0a-97d3690cbe7a" containerName="controller-manager" containerID="cri-o://62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f" gracePeriod=30 Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.456784 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n"] Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.456976 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" podUID="3f867f7d-795a-44c9-9d88-eeb13f54bc1c" containerName="route-controller-manager" containerID="cri-o://9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248" gracePeriod=30 Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.490081 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tdfqz" podStartSLOduration=28.374911304 podStartE2EDuration="52.490066605s" podCreationTimestamp="2026-03-19 15:19:55 +0000 UTC" 
firstStartedPulling="2026-03-19 15:20:22.954772983 +0000 UTC m=+282.183394185" lastFinishedPulling="2026-03-19 15:20:47.069928264 +0000 UTC m=+306.298549486" observedRunningTime="2026-03-19 15:20:47.489050459 +0000 UTC m=+306.717671681" watchObservedRunningTime="2026-03-19 15:20:47.490066605 +0000 UTC m=+306.718687807" Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.543618 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r6j98" podStartSLOduration=1.845803511 podStartE2EDuration="54.54359916s" podCreationTimestamp="2026-03-19 15:19:53 +0000 UTC" firstStartedPulling="2026-03-19 15:19:54.349831865 +0000 UTC m=+253.578453067" lastFinishedPulling="2026-03-19 15:20:47.047627494 +0000 UTC m=+306.276248716" observedRunningTime="2026-03-19 15:20:47.523173367 +0000 UTC m=+306.751794569" watchObservedRunningTime="2026-03-19 15:20:47.54359916 +0000 UTC m=+306.772220362" Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.929904 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:47 crc kubenswrapper[4771]: I0319 15:20:47.947970 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dkz7z" podStartSLOduration=30.877683725 podStartE2EDuration="51.947948458s" podCreationTimestamp="2026-03-19 15:19:56 +0000 UTC" firstStartedPulling="2026-03-19 15:20:26.107948362 +0000 UTC m=+285.336569574" lastFinishedPulling="2026-03-19 15:20:47.178213095 +0000 UTC m=+306.406834307" observedRunningTime="2026-03-19 15:20:47.544558804 +0000 UTC m=+306.773180006" watchObservedRunningTime="2026-03-19 15:20:47.947948458 +0000 UTC m=+307.176569660" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.017997 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.029440 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-client-ca\") pod \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.029492 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-config\") pod \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.029555 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-serving-cert\") pod \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.029613 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49bzf\" (UniqueName: \"kubernetes.io/projected/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-kube-api-access-49bzf\") pod \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\" (UID: \"3f867f7d-795a-44c9-9d88-eeb13f54bc1c\") " Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.030553 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-client-ca" (OuterVolumeSpecName: "client-ca") pod "3f867f7d-795a-44c9-9d88-eeb13f54bc1c" (UID: "3f867f7d-795a-44c9-9d88-eeb13f54bc1c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.030622 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-config" (OuterVolumeSpecName: "config") pod "3f867f7d-795a-44c9-9d88-eeb13f54bc1c" (UID: "3f867f7d-795a-44c9-9d88-eeb13f54bc1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.037183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-kube-api-access-49bzf" (OuterVolumeSpecName: "kube-api-access-49bzf") pod "3f867f7d-795a-44c9-9d88-eeb13f54bc1c" (UID: "3f867f7d-795a-44c9-9d88-eeb13f54bc1c"). InnerVolumeSpecName "kube-api-access-49bzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.037471 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3f867f7d-795a-44c9-9d88-eeb13f54bc1c" (UID: "3f867f7d-795a-44c9-9d88-eeb13f54bc1c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.130603 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb41ac02-a683-418a-8c0a-97d3690cbe7a-serving-cert\") pod \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.130701 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlwtv\" (UniqueName: \"kubernetes.io/projected/fb41ac02-a683-418a-8c0a-97d3690cbe7a-kube-api-access-mlwtv\") pod \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.130724 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-proxy-ca-bundles\") pod \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.130745 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-client-ca\") pod \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.130802 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-config\") pod \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\" (UID: \"fb41ac02-a683-418a-8c0a-97d3690cbe7a\") " Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.131083 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.131101 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.131109 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.131118 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49bzf\" (UniqueName: \"kubernetes.io/projected/3f867f7d-795a-44c9-9d88-eeb13f54bc1c-kube-api-access-49bzf\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.131561 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fb41ac02-a683-418a-8c0a-97d3690cbe7a" (UID: "fb41ac02-a683-418a-8c0a-97d3690cbe7a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.131604 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-config" (OuterVolumeSpecName: "config") pod "fb41ac02-a683-418a-8c0a-97d3690cbe7a" (UID: "fb41ac02-a683-418a-8c0a-97d3690cbe7a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.131638 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-client-ca" (OuterVolumeSpecName: "client-ca") pod "fb41ac02-a683-418a-8c0a-97d3690cbe7a" (UID: "fb41ac02-a683-418a-8c0a-97d3690cbe7a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.137658 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb41ac02-a683-418a-8c0a-97d3690cbe7a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fb41ac02-a683-418a-8c0a-97d3690cbe7a" (UID: "fb41ac02-a683-418a-8c0a-97d3690cbe7a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.137652 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb41ac02-a683-418a-8c0a-97d3690cbe7a-kube-api-access-mlwtv" (OuterVolumeSpecName: "kube-api-access-mlwtv") pod "fb41ac02-a683-418a-8c0a-97d3690cbe7a" (UID: "fb41ac02-a683-418a-8c0a-97d3690cbe7a"). InnerVolumeSpecName "kube-api-access-mlwtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.232023 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.232057 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb41ac02-a683-418a-8c0a-97d3690cbe7a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.232067 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlwtv\" (UniqueName: \"kubernetes.io/projected/fb41ac02-a683-418a-8c0a-97d3690cbe7a-kube-api-access-mlwtv\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.232076 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.232084 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb41ac02-a683-418a-8c0a-97d3690cbe7a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.435807 4771 generic.go:334] "Generic (PLEG): container finished" podID="fb41ac02-a683-418a-8c0a-97d3690cbe7a" containerID="62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f" exitCode=0 Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.436171 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.436562 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" event={"ID":"fb41ac02-a683-418a-8c0a-97d3690cbe7a","Type":"ContainerDied","Data":"62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f"} Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.436617 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b9f96b47-8f9cz" event={"ID":"fb41ac02-a683-418a-8c0a-97d3690cbe7a","Type":"ContainerDied","Data":"66b8fb6a57ce4735789db82deb2fb82b704c7545f5eb74e0dfb1afe052b4480c"} Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.436643 4771 scope.go:117] "RemoveContainer" containerID="62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.441978 4771 generic.go:334] "Generic (PLEG): container finished" podID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerID="ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80" exitCode=0 Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.442034 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fmgq" event={"ID":"ae9495d8-bbe9-4f54-8c12-56b9f40530e1","Type":"ContainerDied","Data":"ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80"} Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.450312 4771 generic.go:334] "Generic (PLEG): container finished" podID="3f867f7d-795a-44c9-9d88-eeb13f54bc1c" containerID="9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248" exitCode=0 Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.450351 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.450548 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" event={"ID":"3f867f7d-795a-44c9-9d88-eeb13f54bc1c","Type":"ContainerDied","Data":"9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248"} Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.450599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n" event={"ID":"3f867f7d-795a-44c9-9d88-eeb13f54bc1c","Type":"ContainerDied","Data":"8565b99f836dbe5cf12691fcc0e35135edea99834e2574faa44d673700bf5e65"} Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.462364 4771 scope.go:117] "RemoveContainer" containerID="62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.463126 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f\": container with ID starting with 62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f not found: ID does not exist" containerID="62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.463174 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f"} err="failed to get container status \"62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f\": rpc error: code = NotFound desc = could not find container \"62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f\": container with ID starting with 
62706817bee080bfe790b5184eac170a2409c42d60a52506274ba77459ffb72f not found: ID does not exist" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.463197 4771 scope.go:117] "RemoveContainer" containerID="9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.478359 4771 scope.go:117] "RemoveContainer" containerID="9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.478767 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248\": container with ID starting with 9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248 not found: ID does not exist" containerID="9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.478794 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248"} err="failed to get container status \"9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248\": rpc error: code = NotFound desc = could not find container \"9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248\": container with ID starting with 9f07d5525082238ef0477c42b71a4c9e55014e404204ded052d82c0b4d828248 not found: ID does not exist" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.487335 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"] Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.494096 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b9f96b47-8f9cz"] Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.497920 4771 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n"] Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.505781 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9ddc4fc97-wb22n"] Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991299 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9"] Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991662 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerName="registry-server" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991695 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerName="registry-server" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991718 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f867f7d-795a-44c9-9d88-eeb13f54bc1c" containerName="route-controller-manager" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991727 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f867f7d-795a-44c9-9d88-eeb13f54bc1c" containerName="route-controller-manager" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991739 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerName="extract-content" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991746 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerName="extract-content" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991759 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c681a9f8-ad65-46af-b5a2-3ea110cda37f" containerName="oc" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991768 4771 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c681a9f8-ad65-46af-b5a2-3ea110cda37f" containerName="oc" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991790 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerName="extract-content" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991799 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerName="extract-content" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991813 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerName="extract-utilities" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991822 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerName="extract-utilities" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991836 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326c866a-522f-4c4c-91ee-7225c1a7d537" containerName="pruner" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991844 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="326c866a-522f-4c4c-91ee-7225c1a7d537" containerName="pruner" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991856 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerName="extract-utilities" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991865 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerName="extract-utilities" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991875 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerName="registry-server" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991882 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerName="registry-server" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991893 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb41ac02-a683-418a-8c0a-97d3690cbe7a" containerName="controller-manager" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991900 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb41ac02-a683-418a-8c0a-97d3690cbe7a" containerName="controller-manager" Mar 19 15:20:48 crc kubenswrapper[4771]: E0319 15:20:48.991913 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3ce0f9-bc02-4142-8655-9751fe9197db" containerName="oc" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.991925 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3ce0f9-bc02-4142-8655-9751fe9197db" containerName="oc" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.992107 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb41ac02-a683-418a-8c0a-97d3690cbe7a" containerName="controller-manager" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.992122 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3ce0f9-bc02-4142-8655-9751fe9197db" containerName="oc" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.992137 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b59b88d-6b8a-43dd-ae40-3091d533b8ae" containerName="registry-server" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.992148 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c681a9f8-ad65-46af-b5a2-3ea110cda37f" containerName="oc" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.992159 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc023f86-d23c-4ca1-810a-de7ece9bb340" containerName="registry-server" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.992172 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3f867f7d-795a-44c9-9d88-eeb13f54bc1c" containerName="route-controller-manager" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.992180 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="326c866a-522f-4c4c-91ee-7225c1a7d537" containerName="pruner" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.992727 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:48 crc kubenswrapper[4771]: I0319 15:20:48.995627 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.001045 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.001869 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c"] Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.003145 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.004675 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.004872 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.005346 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.005347 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.009237 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.010429 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.010680 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.011339 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.011544 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.017277 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" 
Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.017295 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.042449 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c"] Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.046925 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9"] Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.144006 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mr59\" (UniqueName: \"kubernetes.io/projected/f0dbfc2c-0887-43ae-91e7-41f38804b329-kube-api-access-8mr59\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.144384 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-client-ca\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.144547 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0dbfc2c-0887-43ae-91e7-41f38804b329-serving-cert\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.144635 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-config\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.144785 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-client-ca\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.144844 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-config\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.145000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jm5p\" (UniqueName: \"kubernetes.io/projected/4c3bf28f-7584-40d7-bff1-afd0078778cf-kube-api-access-6jm5p\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.145096 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-proxy-ca-bundles\") pod 
\"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.145142 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3bf28f-7584-40d7-bff1-afd0078778cf-serving-cert\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.246027 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-config\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.246118 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-client-ca\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.246141 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-config\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.246194 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6jm5p\" (UniqueName: \"kubernetes.io/projected/4c3bf28f-7584-40d7-bff1-afd0078778cf-kube-api-access-6jm5p\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.246230 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-proxy-ca-bundles\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.246258 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3bf28f-7584-40d7-bff1-afd0078778cf-serving-cert\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.246310 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mr59\" (UniqueName: \"kubernetes.io/projected/f0dbfc2c-0887-43ae-91e7-41f38804b329-kube-api-access-8mr59\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.246354 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-client-ca\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " 
pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.246382 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0dbfc2c-0887-43ae-91e7-41f38804b329-serving-cert\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.247619 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-client-ca\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.247879 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-proxy-ca-bundles\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.249064 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-config\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.249144 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-config\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: 
\"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.249482 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-client-ca\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.253260 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3bf28f-7584-40d7-bff1-afd0078778cf-serving-cert\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.253496 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0dbfc2c-0887-43ae-91e7-41f38804b329-serving-cert\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.262492 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jm5p\" (UniqueName: \"kubernetes.io/projected/4c3bf28f-7584-40d7-bff1-afd0078778cf-kube-api-access-6jm5p\") pod \"route-controller-manager-8487dc6694-djh8c\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.267836 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mr59\" 
(UniqueName: \"kubernetes.io/projected/f0dbfc2c-0887-43ae-91e7-41f38804b329-kube-api-access-8mr59\") pod \"controller-manager-6c9fbfc69d-ztbf9\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.313715 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.340067 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.522611 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f867f7d-795a-44c9-9d88-eeb13f54bc1c" path="/var/lib/kubelet/pods/3f867f7d-795a-44c9-9d88-eeb13f54bc1c/volumes" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.523738 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb41ac02-a683-418a-8c0a-97d3690cbe7a" path="/var/lib/kubelet/pods/fb41ac02-a683-418a-8c0a-97d3690cbe7a/volumes" Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.566638 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9"] Mar 19 15:20:49 crc kubenswrapper[4771]: I0319 15:20:49.615123 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c"] Mar 19 15:20:49 crc kubenswrapper[4771]: W0319 15:20:49.637683 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c3bf28f_7584_40d7_bff1_afd0078778cf.slice/crio-1d02c0f437872e6b0e7b869e497941bf79d48c02482a8940ef63978f5ddb3fe0 WatchSource:0}: Error finding container 
1d02c0f437872e6b0e7b869e497941bf79d48c02482a8940ef63978f5ddb3fe0: Status 404 returned error can't find the container with id 1d02c0f437872e6b0e7b869e497941bf79d48c02482a8940ef63978f5ddb3fe0 Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.477291 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" event={"ID":"4c3bf28f-7584-40d7-bff1-afd0078778cf","Type":"ContainerStarted","Data":"a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2"} Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.477333 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" event={"ID":"4c3bf28f-7584-40d7-bff1-afd0078778cf","Type":"ContainerStarted","Data":"1d02c0f437872e6b0e7b869e497941bf79d48c02482a8940ef63978f5ddb3fe0"} Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.477468 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.480243 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" event={"ID":"f0dbfc2c-0887-43ae-91e7-41f38804b329","Type":"ContainerStarted","Data":"fd19998a1cc897f296a232dd08f82a6c21905755d7b26a408b471f2658478638"} Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.480271 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" event={"ID":"f0dbfc2c-0887-43ae-91e7-41f38804b329","Type":"ContainerStarted","Data":"85ad19553e12a13d7c70ecad038ae63376024606effe59c254f908f3eaf54e5e"} Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.480902 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 
19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.482634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fmgq" event={"ID":"ae9495d8-bbe9-4f54-8c12-56b9f40530e1","Type":"ContainerStarted","Data":"944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47"}
Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.486682 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9"
Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.514679 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" podStartSLOduration=3.514663317 podStartE2EDuration="3.514663317s" podCreationTimestamp="2026-03-19 15:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:20:50.494873311 +0000 UTC m=+309.723494513" watchObservedRunningTime="2026-03-19 15:20:50.514663317 +0000 UTC m=+309.743284519"
Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.531159 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" podStartSLOduration=3.531141755 podStartE2EDuration="3.531141755s" podCreationTimestamp="2026-03-19 15:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:20:50.53016089 +0000 UTC m=+309.758782092" watchObservedRunningTime="2026-03-19 15:20:50.531141755 +0000 UTC m=+309.759762957"
Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.531560 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4fmgq" podStartSLOduration=3.557903865 podStartE2EDuration="58.531555176s" podCreationTimestamp="2026-03-19 15:19:52 +0000 UTC" firstStartedPulling="2026-03-19 15:19:54.349282572 +0000 UTC m=+253.577903774" lastFinishedPulling="2026-03-19 15:20:49.322933883 +0000 UTC m=+308.551555085" observedRunningTime="2026-03-19 15:20:50.512021818 +0000 UTC m=+309.740643010" watchObservedRunningTime="2026-03-19 15:20:50.531555176 +0000 UTC m=+309.760176378"
Mar 19 15:20:50 crc kubenswrapper[4771]: I0319 15:20:50.791656 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.027572 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.028289 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.028373 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.029198 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.029273 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93" gracePeriod=600
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.211758 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4fmgq"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.212548 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4fmgq"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.263105 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4fmgq"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.415287 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.415332 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.458065 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:20:53 crc kubenswrapper[4771]: I0319 15:20:53.535050 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:20:54 crc kubenswrapper[4771]: I0319 15:20:54.503911 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93" exitCode=0
Mar 19 15:20:54 crc kubenswrapper[4771]: I0319 15:20:54.504030 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93"}
Mar 19 15:20:54 crc kubenswrapper[4771]: I0319 15:20:54.541968 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4fmgq"
Mar 19 15:20:55 crc kubenswrapper[4771]: I0319 15:20:55.525386 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"c4dbfc80f1f21c45267b8baa63986792b0ac71a0dd8823637031f7df0184802e"}
Mar 19 15:20:55 crc kubenswrapper[4771]: I0319 15:20:55.779398 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r6j98"]
Mar 19 15:20:55 crc kubenswrapper[4771]: I0319 15:20:55.779616 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r6j98" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerName="registry-server" containerID="cri-o://a62f3fc60a366e767b6a0f9ded31782078e3110062adb178ec9bf703a4c4393b" gracePeriod=2
Mar 19 15:20:56 crc kubenswrapper[4771]: I0319 15:20:56.210672 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tdfqz"
Mar 19 15:20:56 crc kubenswrapper[4771]: I0319 15:20:56.211047 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tdfqz"
Mar 19 15:20:56 crc kubenswrapper[4771]: I0319 15:20:56.251817 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tdfqz"
Mar 19 15:20:56 crc kubenswrapper[4771]: I0319 15:20:56.532518 4771 generic.go:334] "Generic (PLEG): container finished" podID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerID="a62f3fc60a366e767b6a0f9ded31782078e3110062adb178ec9bf703a4c4393b" exitCode=0
Mar 19 15:20:56 crc kubenswrapper[4771]: I0319 15:20:56.533123 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6j98" event={"ID":"6d94fc0b-9a51-41a1-b346-767fa239b631","Type":"ContainerDied","Data":"a62f3fc60a366e767b6a0f9ded31782078e3110062adb178ec9bf703a4c4393b"}
Mar 19 15:20:56 crc kubenswrapper[4771]: I0319 15:20:56.592512 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tdfqz"
Mar 19 15:20:56 crc kubenswrapper[4771]: I0319 15:20:56.614975 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dkz7z"
Mar 19 15:20:56 crc kubenswrapper[4771]: I0319 15:20:56.615221 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dkz7z"
Mar 19 15:20:56 crc kubenswrapper[4771]: I0319 15:20:56.705043 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dkz7z"
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.289602 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.454274 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-catalog-content\") pod \"6d94fc0b-9a51-41a1-b346-767fa239b631\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") "
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.454365 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b68ws\" (UniqueName: \"kubernetes.io/projected/6d94fc0b-9a51-41a1-b346-767fa239b631-kube-api-access-b68ws\") pod \"6d94fc0b-9a51-41a1-b346-767fa239b631\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") "
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.454422 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-utilities\") pod \"6d94fc0b-9a51-41a1-b346-767fa239b631\" (UID: \"6d94fc0b-9a51-41a1-b346-767fa239b631\") "
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.455495 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-utilities" (OuterVolumeSpecName: "utilities") pod "6d94fc0b-9a51-41a1-b346-767fa239b631" (UID: "6d94fc0b-9a51-41a1-b346-767fa239b631"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.463353 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d94fc0b-9a51-41a1-b346-767fa239b631-kube-api-access-b68ws" (OuterVolumeSpecName: "kube-api-access-b68ws") pod "6d94fc0b-9a51-41a1-b346-767fa239b631" (UID: "6d94fc0b-9a51-41a1-b346-767fa239b631"). InnerVolumeSpecName "kube-api-access-b68ws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.542126 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6j98" event={"ID":"6d94fc0b-9a51-41a1-b346-767fa239b631","Type":"ContainerDied","Data":"69d64d35df41d0ecb191c844883fbd0ea9d72f539519d976a24db24cb53d4462"}
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.542201 4771 scope.go:117] "RemoveContainer" containerID="a62f3fc60a366e767b6a0f9ded31782078e3110062adb178ec9bf703a4c4393b"
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.542267 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d94fc0b-9a51-41a1-b346-767fa239b631" (UID: "6d94fc0b-9a51-41a1-b346-767fa239b631"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.542263 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6j98"
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.556051 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.556096 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b68ws\" (UniqueName: \"kubernetes.io/projected/6d94fc0b-9a51-41a1-b346-767fa239b631-kube-api-access-b68ws\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.556116 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d94fc0b-9a51-41a1-b346-767fa239b631-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.567148 4771 scope.go:117] "RemoveContainer" containerID="dec947573a9131191b800de798810731e3d72b766abcebb33bf3eabf9b1a456a"
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.587652 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r6j98"]
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.591920 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r6j98"]
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.602715 4771 scope.go:117] "RemoveContainer" containerID="a6d9502b56a75d5027dea0c6187c409ea0e2713fb292f9eed7c550ba61df0c24"
Mar 19 15:20:57 crc kubenswrapper[4771]: I0319 15:20:57.605456 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dkz7z"
Mar 19 15:20:58 crc kubenswrapper[4771]: I0319 15:20:58.782783 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dkz7z"]
Mar 19 15:20:59 crc kubenswrapper[4771]: I0319 15:20:59.093152 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" podUID="2db36f46-e19a-4b7d-a94f-157f65671639" containerName="oauth-openshift" containerID="cri-o://5eb5780eeaca6f8a841a5a4e07e72f7d687407a6a99c543e7f32501aea385d86" gracePeriod=15
Mar 19 15:20:59 crc kubenswrapper[4771]: I0319 15:20:59.516121 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" path="/var/lib/kubelet/pods/6d94fc0b-9a51-41a1-b346-767fa239b631/volumes"
Mar 19 15:20:59 crc kubenswrapper[4771]: I0319 15:20:59.558481 4771 generic.go:334] "Generic (PLEG): container finished" podID="2db36f46-e19a-4b7d-a94f-157f65671639" containerID="5eb5780eeaca6f8a841a5a4e07e72f7d687407a6a99c543e7f32501aea385d86" exitCode=0
Mar 19 15:20:59 crc kubenswrapper[4771]: I0319 15:20:59.558606 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" event={"ID":"2db36f46-e19a-4b7d-a94f-157f65671639","Type":"ContainerDied","Data":"5eb5780eeaca6f8a841a5a4e07e72f7d687407a6a99c543e7f32501aea385d86"}
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.131128 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294512 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-idp-0-file-data\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294567 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-serving-cert\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294597 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2db36f46-e19a-4b7d-a94f-157f65671639-audit-dir\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294637 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-ocp-branding-template\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294663 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-provider-selection\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294687 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-router-certs\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294740 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-audit-policies\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294768 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-cliconfig\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294809 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-error\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294876 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-service-ca\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppbbp\" (UniqueName: \"kubernetes.io/projected/2db36f46-e19a-4b7d-a94f-157f65671639-kube-api-access-ppbbp\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294931 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-login\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.294957 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-trusted-ca-bundle\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.295002 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-session\") pod \"2db36f46-e19a-4b7d-a94f-157f65671639\" (UID: \"2db36f46-e19a-4b7d-a94f-157f65671639\") "
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.295263 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2db36f46-e19a-4b7d-a94f-157f65671639-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.299166 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.304312 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.304970 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.305034 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.305312 4771 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2db36f46-e19a-4b7d-a94f-157f65671639-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.305340 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.305359 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.305371 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.305390 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.316522 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.318195 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db36f46-e19a-4b7d-a94f-157f65671639-kube-api-access-ppbbp" (OuterVolumeSpecName: "kube-api-access-ppbbp") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "kube-api-access-ppbbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.319866 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.321060 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.321383 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.324101 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.329099 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.329410 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.336159 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2db36f46-e19a-4b7d-a94f-157f65671639" (UID: "2db36f46-e19a-4b7d-a94f-157f65671639"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.406678 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppbbp\" (UniqueName: \"kubernetes.io/projected/2db36f46-e19a-4b7d-a94f-157f65671639-kube-api-access-ppbbp\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.406714 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.406726 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.406737 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.406746 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.406756 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.406766 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.406775 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.406783 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2db36f46-e19a-4b7d-a94f-157f65671639-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.566680 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq" event={"ID":"2db36f46-e19a-4b7d-a94f-157f65671639","Type":"ContainerDied","Data":"bd4d929b59f803fe666e4d85d49f72109ba2f8a6c8da095f0783f57978d76d10"}
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.566745 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qj9qq"
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.566773 4771 scope.go:117] "RemoveContainer" containerID="5eb5780eeaca6f8a841a5a4e07e72f7d687407a6a99c543e7f32501aea385d86"
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.566904 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dkz7z" podUID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerName="registry-server" containerID="cri-o://685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a" gracePeriod=2
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.600608 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qj9qq"]
Mar 19 15:21:00 crc kubenswrapper[4771]: I0319 15:21:00.609035 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qj9qq"]
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.025975 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dkz7z"
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.217121 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k556n\" (UniqueName: \"kubernetes.io/projected/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-kube-api-access-k556n\") pod \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") "
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.217328 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-catalog-content\") pod \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") "
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.217383 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-utilities\") pod \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\" (UID: \"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07\") "
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.218383 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-utilities" (OuterVolumeSpecName: "utilities") pod "1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" (UID: "1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.222639 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-kube-api-access-k556n" (OuterVolumeSpecName: "kube-api-access-k556n") pod "1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" (UID: "1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07"). InnerVolumeSpecName "kube-api-access-k556n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.318703 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.318741 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k556n\" (UniqueName: \"kubernetes.io/projected/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-kube-api-access-k556n\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.365170 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" (UID: "1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.420173 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.520267 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db36f46-e19a-4b7d-a94f-157f65671639" path="/var/lib/kubelet/pods/2db36f46-e19a-4b7d-a94f-157f65671639/volumes"
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.576468 4771 generic.go:334] "Generic (PLEG): container finished" podID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerID="685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a" exitCode=0
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.576556 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkz7z" event={"ID":"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07","Type":"ContainerDied","Data":"685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a"}
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.576603 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dkz7z"
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.576642 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkz7z" event={"ID":"1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07","Type":"ContainerDied","Data":"a1728887f917bf4394aba75df6e60e54b6c80a605411f4268cde7757b084a068"}
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.576675 4771 scope.go:117] "RemoveContainer" containerID="685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a"
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.601451 4771 scope.go:117] "RemoveContainer" containerID="ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f"
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.602232 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dkz7z"]
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.609093 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dkz7z"]
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.623725 4771 scope.go:117] "RemoveContainer" containerID="74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f"
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.646259 4771 scope.go:117] "RemoveContainer" containerID="685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a"
Mar 19 15:21:01 crc kubenswrapper[4771]: E0319 15:21:01.646769 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a\": container with ID starting with 685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a not found: ID does not exist" containerID="685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a"
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.646813 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a"} err="failed to get container status \"685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a\": rpc error: code = NotFound desc = could not find container \"685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a\": container with ID starting with 685c5b9513f03af54481654afeec7492661ec4c44a7284aa2c7fb54081650b5a not found: ID does not exist"
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.646841 4771 scope.go:117] "RemoveContainer" containerID="ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f"
Mar 19 15:21:01 crc kubenswrapper[4771]: E0319 15:21:01.647424 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f\": container with ID starting with ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f not found: ID does not exist" containerID="ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f"
Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.647509 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f"} err="failed to get container status \"ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f\": rpc error: code = NotFound desc = could not find container \"ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f\": container with ID starting with ba7700933f4df7669048b42b968e0f31c6b37f46cabdbbbe3dc933d65437035f not found:
ID does not exist" Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.647591 4771 scope.go:117] "RemoveContainer" containerID="74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f" Mar 19 15:21:01 crc kubenswrapper[4771]: E0319 15:21:01.648407 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f\": container with ID starting with 74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f not found: ID does not exist" containerID="74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f" Mar 19 15:21:01 crc kubenswrapper[4771]: I0319 15:21:01.648442 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f"} err="failed to get container status \"74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f\": rpc error: code = NotFound desc = could not find container \"74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f\": container with ID starting with 74a3f008345226ac855c1ffd682e82661be95a19d1e5a9259ef4c667d494013f not found: ID does not exist" Mar 19 15:21:03 crc kubenswrapper[4771]: I0319 15:21:03.522847 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" path="/var/lib/kubelet/pods/1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07/volumes" Mar 19 15:21:03 crc kubenswrapper[4771]: I0319 15:21:03.999733 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79cb59f449-cs2kb"] Mar 19 15:21:04 crc kubenswrapper[4771]: E0319 15:21:04.000029 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerName="registry-server" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000047 4771 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerName="registry-server" Mar 19 15:21:04 crc kubenswrapper[4771]: E0319 15:21:04.000060 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerName="registry-server" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000066 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerName="registry-server" Mar 19 15:21:04 crc kubenswrapper[4771]: E0319 15:21:04.000076 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerName="extract-content" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000083 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerName="extract-content" Mar 19 15:21:04 crc kubenswrapper[4771]: E0319 15:21:04.000089 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerName="extract-utilities" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000096 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerName="extract-utilities" Mar 19 15:21:04 crc kubenswrapper[4771]: E0319 15:21:04.000109 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerName="extract-utilities" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000117 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerName="extract-utilities" Mar 19 15:21:04 crc kubenswrapper[4771]: E0319 15:21:04.000130 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerName="extract-content" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000136 4771 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerName="extract-content" Mar 19 15:21:04 crc kubenswrapper[4771]: E0319 15:21:04.000145 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db36f46-e19a-4b7d-a94f-157f65671639" containerName="oauth-openshift" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000151 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db36f46-e19a-4b7d-a94f-157f65671639" containerName="oauth-openshift" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000259 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db36f46-e19a-4b7d-a94f-157f65671639" containerName="oauth-openshift" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000277 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d94fc0b-9a51-41a1-b346-767fa239b631" containerName="registry-server" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000286 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d597ee0-bec9-4b6d-8d02-ce5a0b65fe07" containerName="registry-server" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.000794 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.004626 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.004700 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.006682 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.006710 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.006846 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.006917 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.006932 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.006950 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.006882 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.007502 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 15:21:04 crc 
kubenswrapper[4771]: I0319 15:21:04.016021 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.016107 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.017020 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.030275 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79cb59f449-cs2kb"] Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.030661 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.034216 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152169 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-audit-dir\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152231 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " 
pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152291 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-audit-policies\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152322 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-template-error\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152367 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152389 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-template-login\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152562 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-router-certs\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152642 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152715 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152754 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jzj\" 
(UniqueName: \"kubernetes.io/projected/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-kube-api-access-k2jzj\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152876 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152923 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-service-ca\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.152965 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-session\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254093 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: 
\"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254161 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2jzj\" (UniqueName: \"kubernetes.io/projected/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-kube-api-access-k2jzj\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254230 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254267 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-service-ca\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254304 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-session\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-audit-dir\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254393 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254449 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-audit-policies\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254486 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-template-error\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254543 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 
19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254580 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-template-login\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254657 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-router-certs\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254703 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.254740 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.255118 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-service-ca\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.255206 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-audit-dir\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.255562 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.256201 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-audit-policies\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.256692 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.260408 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.260528 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-template-login\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.260839 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.261039 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.261481 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-session\") pod 
\"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.262683 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.264229 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-system-router-certs\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.266162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-v4-0-config-user-template-error\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.274698 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2jzj\" (UniqueName: \"kubernetes.io/projected/7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53-kube-api-access-k2jzj\") pod \"oauth-openshift-79cb59f449-cs2kb\" (UID: \"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53\") " pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.328789 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:04 crc kubenswrapper[4771]: I0319 15:21:04.810620 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79cb59f449-cs2kb"] Mar 19 15:21:05 crc kubenswrapper[4771]: I0319 15:21:05.607577 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" event={"ID":"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53","Type":"ContainerStarted","Data":"e899c4a3a55a2b8dd063de2cd8258f2386514e6cffdb26be1ce443a3a7319262"} Mar 19 15:21:05 crc kubenswrapper[4771]: I0319 15:21:05.610147 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:05 crc kubenswrapper[4771]: I0319 15:21:05.610172 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" event={"ID":"7dfcf472-2fdc-48e0-af2b-e1cb9b1f5b53","Type":"ContainerStarted","Data":"0f380a25b19da45adf3dbdb883d8904bf8533ff1dd87290235e9ccee18d1a34a"} Mar 19 15:21:05 crc kubenswrapper[4771]: I0319 15:21:05.632493 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" podStartSLOduration=31.632472368 podStartE2EDuration="31.632472368s" podCreationTimestamp="2026-03-19 15:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:21:05.629775377 +0000 UTC m=+324.858396619" watchObservedRunningTime="2026-03-19 15:21:05.632472368 +0000 UTC m=+324.861093570" Mar 19 15:21:05 crc kubenswrapper[4771]: I0319 15:21:05.893185 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79cb59f449-cs2kb" Mar 19 15:21:07 crc kubenswrapper[4771]: I0319 15:21:07.434928 4771 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9"] Mar 19 15:21:07 crc kubenswrapper[4771]: I0319 15:21:07.435427 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" podUID="f0dbfc2c-0887-43ae-91e7-41f38804b329" containerName="controller-manager" containerID="cri-o://fd19998a1cc897f296a232dd08f82a6c21905755d7b26a408b471f2658478638" gracePeriod=30 Mar 19 15:21:07 crc kubenswrapper[4771]: I0319 15:21:07.541501 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c"] Mar 19 15:21:07 crc kubenswrapper[4771]: I0319 15:21:07.541786 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" podUID="4c3bf28f-7584-40d7-bff1-afd0078778cf" containerName="route-controller-manager" containerID="cri-o://a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2" gracePeriod=30 Mar 19 15:21:07 crc kubenswrapper[4771]: I0319 15:21:07.627519 4771 generic.go:334] "Generic (PLEG): container finished" podID="f0dbfc2c-0887-43ae-91e7-41f38804b329" containerID="fd19998a1cc897f296a232dd08f82a6c21905755d7b26a408b471f2658478638" exitCode=0 Mar 19 15:21:07 crc kubenswrapper[4771]: I0319 15:21:07.628190 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" event={"ID":"f0dbfc2c-0887-43ae-91e7-41f38804b329","Type":"ContainerDied","Data":"fd19998a1cc897f296a232dd08f82a6c21905755d7b26a408b471f2658478638"} Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.069669 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.074612 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.114804 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-config\") pod \"f0dbfc2c-0887-43ae-91e7-41f38804b329\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.114922 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-config\") pod \"4c3bf28f-7584-40d7-bff1-afd0078778cf\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.114981 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0dbfc2c-0887-43ae-91e7-41f38804b329-serving-cert\") pod \"f0dbfc2c-0887-43ae-91e7-41f38804b329\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.115059 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3bf28f-7584-40d7-bff1-afd0078778cf-serving-cert\") pod \"4c3bf28f-7584-40d7-bff1-afd0078778cf\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.115118 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mr59\" (UniqueName: \"kubernetes.io/projected/f0dbfc2c-0887-43ae-91e7-41f38804b329-kube-api-access-8mr59\") pod 
\"f0dbfc2c-0887-43ae-91e7-41f38804b329\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.115188 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-client-ca\") pod \"f0dbfc2c-0887-43ae-91e7-41f38804b329\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.115235 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jm5p\" (UniqueName: \"kubernetes.io/projected/4c3bf28f-7584-40d7-bff1-afd0078778cf-kube-api-access-6jm5p\") pod \"4c3bf28f-7584-40d7-bff1-afd0078778cf\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.115289 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-client-ca\") pod \"4c3bf28f-7584-40d7-bff1-afd0078778cf\" (UID: \"4c3bf28f-7584-40d7-bff1-afd0078778cf\") " Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.115324 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-proxy-ca-bundles\") pod \"f0dbfc2c-0887-43ae-91e7-41f38804b329\" (UID: \"f0dbfc2c-0887-43ae-91e7-41f38804b329\") " Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.116340 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-config" (OuterVolumeSpecName: "config") pod "4c3bf28f-7584-40d7-bff1-afd0078778cf" (UID: "4c3bf28f-7584-40d7-bff1-afd0078778cf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.116900 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-client-ca" (OuterVolumeSpecName: "client-ca") pod "f0dbfc2c-0887-43ae-91e7-41f38804b329" (UID: "f0dbfc2c-0887-43ae-91e7-41f38804b329"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.116929 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f0dbfc2c-0887-43ae-91e7-41f38804b329" (UID: "f0dbfc2c-0887-43ae-91e7-41f38804b329"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.116974 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-config" (OuterVolumeSpecName: "config") pod "f0dbfc2c-0887-43ae-91e7-41f38804b329" (UID: "f0dbfc2c-0887-43ae-91e7-41f38804b329"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.117511 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c3bf28f-7584-40d7-bff1-afd0078778cf" (UID: "4c3bf28f-7584-40d7-bff1-afd0078778cf"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.120853 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3bf28f-7584-40d7-bff1-afd0078778cf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c3bf28f-7584-40d7-bff1-afd0078778cf" (UID: "4c3bf28f-7584-40d7-bff1-afd0078778cf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.121138 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0dbfc2c-0887-43ae-91e7-41f38804b329-kube-api-access-8mr59" (OuterVolumeSpecName: "kube-api-access-8mr59") pod "f0dbfc2c-0887-43ae-91e7-41f38804b329" (UID: "f0dbfc2c-0887-43ae-91e7-41f38804b329"). InnerVolumeSpecName "kube-api-access-8mr59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.121552 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0dbfc2c-0887-43ae-91e7-41f38804b329-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f0dbfc2c-0887-43ae-91e7-41f38804b329" (UID: "f0dbfc2c-0887-43ae-91e7-41f38804b329"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.122141 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c3bf28f-7584-40d7-bff1-afd0078778cf-kube-api-access-6jm5p" (OuterVolumeSpecName: "kube-api-access-6jm5p") pod "4c3bf28f-7584-40d7-bff1-afd0078778cf" (UID: "4c3bf28f-7584-40d7-bff1-afd0078778cf"). InnerVolumeSpecName "kube-api-access-6jm5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.217611 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.217954 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.218162 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.218305 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0dbfc2c-0887-43ae-91e7-41f38804b329-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.218423 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3bf28f-7584-40d7-bff1-afd0078778cf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.218538 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mr59\" (UniqueName: \"kubernetes.io/projected/f0dbfc2c-0887-43ae-91e7-41f38804b329-kube-api-access-8mr59\") on node \"crc\" DevicePath \"\"" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.218684 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0dbfc2c-0887-43ae-91e7-41f38804b329-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.218800 4771 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-6jm5p\" (UniqueName: \"kubernetes.io/projected/4c3bf28f-7584-40d7-bff1-afd0078778cf-kube-api-access-6jm5p\") on node \"crc\" DevicePath \"\"" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.218912 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c3bf28f-7584-40d7-bff1-afd0078778cf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.636616 4771 generic.go:334] "Generic (PLEG): container finished" podID="4c3bf28f-7584-40d7-bff1-afd0078778cf" containerID="a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2" exitCode=0 Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.636725 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" event={"ID":"4c3bf28f-7584-40d7-bff1-afd0078778cf","Type":"ContainerDied","Data":"a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2"} Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.636795 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" event={"ID":"4c3bf28f-7584-40d7-bff1-afd0078778cf","Type":"ContainerDied","Data":"1d02c0f437872e6b0e7b869e497941bf79d48c02482a8940ef63978f5ddb3fe0"} Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.636824 4771 scope.go:117] "RemoveContainer" containerID="a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.637011 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.641452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" event={"ID":"f0dbfc2c-0887-43ae-91e7-41f38804b329","Type":"ContainerDied","Data":"85ad19553e12a13d7c70ecad038ae63376024606effe59c254f908f3eaf54e5e"} Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.641628 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.668027 4771 scope.go:117] "RemoveContainer" containerID="a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2" Mar 19 15:21:08 crc kubenswrapper[4771]: E0319 15:21:08.668646 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2\": container with ID starting with a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2 not found: ID does not exist" containerID="a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.668737 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2"} err="failed to get container status \"a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2\": rpc error: code = NotFound desc = could not find container \"a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2\": container with ID starting with a9f6e56f91e4934e289f2fedc734d017159cc650b7a941dd6d334667ecf1eea2 not found: ID does not exist" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.668774 4771 scope.go:117] "RemoveContainer" 
containerID="fd19998a1cc897f296a232dd08f82a6c21905755d7b26a408b471f2658478638" Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.677063 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c"] Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.689564 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8487dc6694-djh8c"] Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.696312 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9"] Mar 19 15:21:08 crc kubenswrapper[4771]: I0319 15:21:08.701415 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c9fbfc69d-ztbf9"] Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.002080 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-785798cc6b-b6lrf"] Mar 19 15:21:09 crc kubenswrapper[4771]: E0319 15:21:09.002296 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3bf28f-7584-40d7-bff1-afd0078778cf" containerName="route-controller-manager" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.002308 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3bf28f-7584-40d7-bff1-afd0078778cf" containerName="route-controller-manager" Mar 19 15:21:09 crc kubenswrapper[4771]: E0319 15:21:09.002318 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0dbfc2c-0887-43ae-91e7-41f38804b329" containerName="controller-manager" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.002325 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0dbfc2c-0887-43ae-91e7-41f38804b329" containerName="controller-manager" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.002444 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f0dbfc2c-0887-43ae-91e7-41f38804b329" containerName="controller-manager" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.002455 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3bf28f-7584-40d7-bff1-afd0078778cf" containerName="route-controller-manager" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.002861 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.006851 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.007456 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.007607 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.008608 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.009372 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.009720 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq"] Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.010033 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.011018 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.013659 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.014542 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.015371 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.016236 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.016362 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.018027 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-785798cc6b-b6lrf"] Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.021555 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.021849 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq"] Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.023609 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.029805 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x98mn\" (UniqueName: \"kubernetes.io/projected/9c56f9b9-fbb8-45da-8549-a13ed25d393c-kube-api-access-x98mn\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.029848 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971f8cfe-7117-4ab1-9764-61ea28840c4e-serving-cert\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.029890 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c56f9b9-fbb8-45da-8549-a13ed25d393c-serving-cert\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.029908 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c56f9b9-fbb8-45da-8549-a13ed25d393c-config\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.029929 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c56f9b9-fbb8-45da-8549-a13ed25d393c-proxy-ca-bundles\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " 
pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.029956 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971f8cfe-7117-4ab1-9764-61ea28840c4e-config\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.030004 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/971f8cfe-7117-4ab1-9764-61ea28840c4e-client-ca\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.030023 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trffm\" (UniqueName: \"kubernetes.io/projected/971f8cfe-7117-4ab1-9764-61ea28840c4e-kube-api-access-trffm\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.030045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c56f9b9-fbb8-45da-8549-a13ed25d393c-client-ca\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.131241 4771 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c56f9b9-fbb8-45da-8549-a13ed25d393c-proxy-ca-bundles\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.131286 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971f8cfe-7117-4ab1-9764-61ea28840c4e-config\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.131325 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/971f8cfe-7117-4ab1-9764-61ea28840c4e-client-ca\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.131344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trffm\" (UniqueName: \"kubernetes.io/projected/971f8cfe-7117-4ab1-9764-61ea28840c4e-kube-api-access-trffm\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.131362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c56f9b9-fbb8-45da-8549-a13ed25d393c-client-ca\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " 
pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.131454 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98mn\" (UniqueName: \"kubernetes.io/projected/9c56f9b9-fbb8-45da-8549-a13ed25d393c-kube-api-access-x98mn\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.131484 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971f8cfe-7117-4ab1-9764-61ea28840c4e-serving-cert\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.131529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c56f9b9-fbb8-45da-8549-a13ed25d393c-serving-cert\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.131550 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c56f9b9-fbb8-45da-8549-a13ed25d393c-config\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.132637 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/9c56f9b9-fbb8-45da-8549-a13ed25d393c-proxy-ca-bundles\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.132665 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c56f9b9-fbb8-45da-8549-a13ed25d393c-client-ca\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.133290 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/971f8cfe-7117-4ab1-9764-61ea28840c4e-client-ca\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.133287 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c56f9b9-fbb8-45da-8549-a13ed25d393c-config\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.134625 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971f8cfe-7117-4ab1-9764-61ea28840c4e-config\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.137802 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971f8cfe-7117-4ab1-9764-61ea28840c4e-serving-cert\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.140101 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c56f9b9-fbb8-45da-8549-a13ed25d393c-serving-cert\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.148035 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98mn\" (UniqueName: \"kubernetes.io/projected/9c56f9b9-fbb8-45da-8549-a13ed25d393c-kube-api-access-x98mn\") pod \"controller-manager-785798cc6b-b6lrf\" (UID: \"9c56f9b9-fbb8-45da-8549-a13ed25d393c\") " pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.158554 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trffm\" (UniqueName: \"kubernetes.io/projected/971f8cfe-7117-4ab1-9764-61ea28840c4e-kube-api-access-trffm\") pod \"route-controller-manager-86d44487b6-gp7tq\" (UID: \"971f8cfe-7117-4ab1-9764-61ea28840c4e\") " pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.346649 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.367763 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.517903 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c3bf28f-7584-40d7-bff1-afd0078778cf" path="/var/lib/kubelet/pods/4c3bf28f-7584-40d7-bff1-afd0078778cf/volumes" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.519222 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0dbfc2c-0887-43ae-91e7-41f38804b329" path="/var/lib/kubelet/pods/f0dbfc2c-0887-43ae-91e7-41f38804b329/volumes" Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.564400 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-785798cc6b-b6lrf"] Mar 19 15:21:09 crc kubenswrapper[4771]: W0319 15:21:09.576633 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c56f9b9_fbb8_45da_8549_a13ed25d393c.slice/crio-86f2701e5ad917ba07c4a62f50e66f8bf6ffa9285eb8d5b4bda8b19b3bc3f6ba WatchSource:0}: Error finding container 86f2701e5ad917ba07c4a62f50e66f8bf6ffa9285eb8d5b4bda8b19b3bc3f6ba: Status 404 returned error can't find the container with id 86f2701e5ad917ba07c4a62f50e66f8bf6ffa9285eb8d5b4bda8b19b3bc3f6ba Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.624264 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq"] Mar 19 15:21:09 crc kubenswrapper[4771]: W0319 15:21:09.632739 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod971f8cfe_7117_4ab1_9764_61ea28840c4e.slice/crio-7f0b1b39d353e1669f743737105c95c6c310a035aca4fed36c19369d0b87a089 WatchSource:0}: Error finding container 7f0b1b39d353e1669f743737105c95c6c310a035aca4fed36c19369d0b87a089: Status 404 returned error can't find the 
container with id 7f0b1b39d353e1669f743737105c95c6c310a035aca4fed36c19369d0b87a089 Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.662865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" event={"ID":"9c56f9b9-fbb8-45da-8549-a13ed25d393c","Type":"ContainerStarted","Data":"86f2701e5ad917ba07c4a62f50e66f8bf6ffa9285eb8d5b4bda8b19b3bc3f6ba"} Mar 19 15:21:09 crc kubenswrapper[4771]: I0319 15:21:09.665750 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" event={"ID":"971f8cfe-7117-4ab1-9764-61ea28840c4e","Type":"ContainerStarted","Data":"7f0b1b39d353e1669f743737105c95c6c310a035aca4fed36c19369d0b87a089"} Mar 19 15:21:10 crc kubenswrapper[4771]: I0319 15:21:10.674161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" event={"ID":"971f8cfe-7117-4ab1-9764-61ea28840c4e","Type":"ContainerStarted","Data":"e890fb3d87fd24e3ea7c73b48ae569a46c8b52a034a73d2af3785d8126140e92"} Mar 19 15:21:10 crc kubenswrapper[4771]: I0319 15:21:10.674532 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:10 crc kubenswrapper[4771]: I0319 15:21:10.676264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" event={"ID":"9c56f9b9-fbb8-45da-8549-a13ed25d393c","Type":"ContainerStarted","Data":"b528ebffa93ccd38401f563fe35c666857d28ee43fb2e5e6a52ef869c1805f87"} Mar 19 15:21:10 crc kubenswrapper[4771]: I0319 15:21:10.677156 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:10 crc kubenswrapper[4771]: I0319 15:21:10.689135 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" Mar 19 15:21:10 crc kubenswrapper[4771]: I0319 15:21:10.692883 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" Mar 19 15:21:10 crc kubenswrapper[4771]: I0319 15:21:10.699856 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86d44487b6-gp7tq" podStartSLOduration=3.699830876 podStartE2EDuration="3.699830876s" podCreationTimestamp="2026-03-19 15:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:21:10.692436913 +0000 UTC m=+329.921058155" watchObservedRunningTime="2026-03-19 15:21:10.699830876 +0000 UTC m=+329.928452108" Mar 19 15:21:10 crc kubenswrapper[4771]: I0319 15:21:10.724140 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-785798cc6b-b6lrf" podStartSLOduration=3.7241085590000003 podStartE2EDuration="3.724108559s" podCreationTimestamp="2026-03-19 15:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:21:10.717462866 +0000 UTC m=+329.946084108" watchObservedRunningTime="2026-03-19 15:21:10.724108559 +0000 UTC m=+329.952729801" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.533296 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.535035 4771 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.535228 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.535404 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59" gracePeriod=15 Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.535466 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913" gracePeriod=15 Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.535556 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695" gracePeriod=15 Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.535633 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03" gracePeriod=15 Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536276 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.535580 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344" gracePeriod=15 Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.536580 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536600 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.536614 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536628 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.536646 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536658 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.536678 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536690 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.536706 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 
15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536718 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.536733 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536745 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.536760 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536771 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.536798 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536811 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.536827 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.536840 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537142 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537161 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537176 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537194 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537216 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537237 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537256 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.537482 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537497 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537711 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.537733 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.581061 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.582424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.582628 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.582845 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.582919 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.582951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.584714 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.584799 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.585056 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686204 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686324 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686501 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686571 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686638 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686689 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686890 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.686951 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.687023 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.687064 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.687105 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.687144 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.687184 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.687222 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.745536 4771 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.746801 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.747664 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913" exitCode=0 Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.747706 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344" exitCode=2 Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.747754 4771 scope.go:117] "RemoveContainer" containerID="efec32569baf7f2ab751a3a5379cea252d2026eff475f352bbbe3f662d8580b6" Mar 19 15:21:19 crc kubenswrapper[4771]: I0319 15:21:19.873172 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:21:19 crc kubenswrapper[4771]: E0319 15:21:19.925352 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e47487eeedf90 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:21:19.923355536 +0000 UTC m=+339.151976738,LastTimestamp:2026-03-19 15:21:19.923355536 +0000 UTC m=+339.151976738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.757355 4771 generic.go:334] "Generic (PLEG): container finished" podID="6730def2-f246-44ea-9ee9-93f4e490008d" containerID="ee0f7aacd635440fd47722a932fb07df752eb188e88672ae56dc9d9d9bd0df8d" exitCode=0 Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.757445 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6730def2-f246-44ea-9ee9-93f4e490008d","Type":"ContainerDied","Data":"ee0f7aacd635440fd47722a932fb07df752eb188e88672ae56dc9d9d9bd0df8d"} Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.758796 4771 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.759366 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.759746 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.760094 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d"} Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.760162 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ee997d6d279e3c5dc6e505f45ed44aff64a1f16aa45b85acf37de32d2c42831d"} Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.760669 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.761156 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.761505 4771 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.763727 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.764835 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03" exitCode=0 Mar 19 15:21:20 crc kubenswrapper[4771]: I0319 15:21:20.764891 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695" exitCode=0 Mar 19 15:21:20 crc kubenswrapper[4771]: E0319 15:21:20.800413 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.50:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e47487eeedf90 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:21:19.923355536 +0000 UTC m=+339.151976738,LastTimestamp:2026-03-19 15:21:19.923355536 +0000 UTC m=+339.151976738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.082135 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body=
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.082215 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.513027 4771 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.513441 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.513853 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.901768 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.902783 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.903709 4771 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.905368 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.905571 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.993486 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.993555 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.993579 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.993613 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.993638 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.993729 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.993914 4771 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.993927 4771 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:21 crc kubenswrapper[4771]: I0319 15:21:21.993935 4771 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.088291 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.089676 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.090199 4771 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.090906 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.197165 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6730def2-f246-44ea-9ee9-93f4e490008d-kube-api-access\") pod \"6730def2-f246-44ea-9ee9-93f4e490008d\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") "
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.197334 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-kubelet-dir\") pod \"6730def2-f246-44ea-9ee9-93f4e490008d\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") "
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.197389 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-var-lock\") pod \"6730def2-f246-44ea-9ee9-93f4e490008d\" (UID: \"6730def2-f246-44ea-9ee9-93f4e490008d\") "
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.197525 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6730def2-f246-44ea-9ee9-93f4e490008d" (UID: "6730def2-f246-44ea-9ee9-93f4e490008d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.197658 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-var-lock" (OuterVolumeSpecName: "var-lock") pod "6730def2-f246-44ea-9ee9-93f4e490008d" (UID: "6730def2-f246-44ea-9ee9-93f4e490008d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.197892 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.197932 4771 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6730def2-f246-44ea-9ee9-93f4e490008d-var-lock\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.206491 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6730def2-f246-44ea-9ee9-93f4e490008d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6730def2-f246-44ea-9ee9-93f4e490008d" (UID: "6730def2-f246-44ea-9ee9-93f4e490008d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.299129 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6730def2-f246-44ea-9ee9-93f4e490008d-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.779062 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6730def2-f246-44ea-9ee9-93f4e490008d","Type":"ContainerDied","Data":"697d9972d87841f97f90f55f0213281add336109b44c0917d95d48f781238112"}
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.779098 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.779108 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="697d9972d87841f97f90f55f0213281add336109b44c0917d95d48f781238112"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.787038 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.788449 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59" exitCode=0
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.788521 4771 scope.go:117] "RemoveContainer" containerID="4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.788675 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.810477 4771 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.810911 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.811214 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.821447 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.821570 4771 scope.go:117] "RemoveContainer" containerID="3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.822271 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.823124 4771 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.845921 4771 scope.go:117] "RemoveContainer" containerID="f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.864236 4771 scope.go:117] "RemoveContainer" containerID="f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.878485 4771 scope.go:117] "RemoveContainer" containerID="08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.904705 4771 scope.go:117] "RemoveContainer" containerID="e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.926300 4771 scope.go:117] "RemoveContainer" containerID="4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913"
Mar 19 15:21:22 crc kubenswrapper[4771]: E0319 15:21:22.927079 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\": container with ID starting with 4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913 not found: ID does not exist" containerID="4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.927683 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913"} err="failed to get container status \"4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\": rpc error: code = NotFound desc = could not find container \"4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913\": container with ID starting with 4f8fe08e1ab939f664899d19125ac73aabcd08d7c765b151d3385ad32d952913 not found: ID does not exist"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.927754 4771 scope.go:117] "RemoveContainer" containerID="3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03"
Mar 19 15:21:22 crc kubenswrapper[4771]: E0319 15:21:22.930281 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\": container with ID starting with 3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03 not found: ID does not exist" containerID="3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.930320 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03"} err="failed to get container status \"3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\": rpc error: code = NotFound desc = could not find container \"3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03\": container with ID starting with 3fd02057b09e73164de473c76a93c8d78dcb413bbbf2e907195a50ac3f6aea03 not found: ID does not exist"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.930349 4771 scope.go:117] "RemoveContainer" containerID="f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695"
Mar 19 15:21:22 crc kubenswrapper[4771]: E0319 15:21:22.930947 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\": container with ID starting with f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695 not found: ID does not exist" containerID="f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.931012 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695"} err="failed to get container status \"f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\": rpc error: code = NotFound desc = could not find container \"f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695\": container with ID starting with f3f6c9a8ee1d260bb3ccfa44e7c372d00df2ff2866a93c57d45d24ee7438f695 not found: ID does not exist"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.931042 4771 scope.go:117] "RemoveContainer" containerID="f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344"
Mar 19 15:21:22 crc kubenswrapper[4771]: E0319 15:21:22.931454 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\": container with ID starting with f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344 not found: ID does not exist" containerID="f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.931568 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344"} err="failed to get container status \"f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\": rpc error: code = NotFound desc = could not find container \"f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344\": container with ID starting with f7004f6368d8f320bb78de63cd858433be78f2733888082ab46f7df04e7c7344 not found: ID does not exist"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.931601 4771 scope.go:117] "RemoveContainer" containerID="08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59"
Mar 19 15:21:22 crc kubenswrapper[4771]: E0319 15:21:22.931902 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\": container with ID starting with 08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59 not found: ID does not exist" containerID="08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.931934 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59"} err="failed to get container status \"08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\": rpc error: code = NotFound desc = could not find container \"08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59\": container with ID starting with 08cb1a2eab93888fa55acb10dc90a7ce5f5c4f59e1bdf626ac6131efe24b0a59 not found: ID does not exist"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.931962 4771 scope.go:117] "RemoveContainer" containerID="e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434"
Mar 19 15:21:22 crc kubenswrapper[4771]: E0319 15:21:22.932378 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\": container with ID starting with e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434 not found: ID does not exist" containerID="e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434"
Mar 19 15:21:22 crc kubenswrapper[4771]: I0319 15:21:22.932441 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434"} err="failed to get container status \"e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\": rpc error: code = NotFound desc = could not find container \"e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434\": container with ID starting with e5f528b8171d5333fda98113022391d3181d5e4bdba938ede3b171f585c11434 not found: ID does not exist"
Mar 19 15:21:23 crc kubenswrapper[4771]: I0319 15:21:23.519656 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 19 15:21:24 crc kubenswrapper[4771]: E0319 15:21:24.172018 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:24 crc kubenswrapper[4771]: E0319 15:21:24.172573 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:24 crc kubenswrapper[4771]: E0319 15:21:24.173017 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:24 crc kubenswrapper[4771]: E0319 15:21:24.173516 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:24 crc kubenswrapper[4771]: E0319 15:21:24.173937 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 19 15:21:24 crc kubenswrapper[4771]: I0319 15:21:24.173970 4771 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 19 15:21:24 crc kubenswrapper[4771]: E0319 15:21:24.174257 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="200ms"
Mar 19 15:21:24 crc kubenswrapper[4771]: E0319 15:21:24.375512 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="400ms"
Mar 19 15:21:24 crc kubenswrapper[4771]: E0319 15:21:24.775927 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="800ms"
Mar 19 15:21:25 crc kubenswrapper[4771]: I0319 15:21:25.450290 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:21:25 crc kubenswrapper[4771]: I0319 15:21:25.450685 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:21:25 crc kubenswrapper[4771]: W0319 15:21:25.451365 4771 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused
Mar 19 15:21:25 crc kubenswrapper[4771]: E0319 15:21:25.451457 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError"
Mar 19 15:21:25 crc kubenswrapper[4771]: W0319 15:21:25.451545 4771 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused
Mar 19 15:21:25 crc kubenswrapper[4771]: E0319 15:21:25.451599 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError"
Mar 19 15:21:25 crc kubenswrapper[4771]: I0319 15:21:25.552433 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:21:25 crc kubenswrapper[4771]: I0319 15:21:25.552483 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:21:25 crc kubenswrapper[4771]: W0319 15:21:25.553111 4771 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused
Mar 19 15:21:25 crc kubenswrapper[4771]: E0319 15:21:25.553215 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError"
Mar 19 15:21:25 crc kubenswrapper[4771]: E0319 15:21:25.576524 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="1.6s"
Mar 19 15:21:26 crc kubenswrapper[4771]: E0319 15:21:26.451843 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:26 crc kubenswrapper[4771]: E0319 15:21:26.451861 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 15:21:26 crc kubenswrapper[4771]: E0319 15:21:26.451920 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:23:28.45190028 +0000 UTC m=+467.680521482 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:26 crc kubenswrapper[4771]: E0319 15:21:26.451974 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 15:23:28.451944361 +0000 UTC m=+467.680565603 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition
Mar 19 15:21:26 crc kubenswrapper[4771]: E0319 15:21:26.552671 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:26 crc kubenswrapper[4771]: E0319 15:21:26.552801 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:26 crc kubenswrapper[4771]: W0319 15:21:26.553477 4771 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused
Mar 19 15:21:26 crc kubenswrapper[4771]: E0319 15:21:26.553572 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError"
Mar 19 15:21:27 crc kubenswrapper[4771]: E0319 15:21:27.178282 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="3.2s"
Mar 19 15:21:27 crc kubenswrapper[4771]: E0319 15:21:27.553554 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:27 crc kubenswrapper[4771]: E0319 15:21:27.553594 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:27 crc kubenswrapper[4771]: E0319 15:21:27.553602 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:27 crc kubenswrapper[4771]: E0319 15:21:27.553632 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:27 crc kubenswrapper[4771]: E0319 15:21:27.553674 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 15:23:29.553649548 +0000 UTC m=+468.782270790 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:27 crc kubenswrapper[4771]: E0319 15:21:27.553699 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 15:23:29.553687439 +0000 UTC m=+468.782308681 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 15:21:28 crc kubenswrapper[4771]: W0319 15:21:28.208948 4771 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused
Mar 19 15:21:28 crc kubenswrapper[4771]: E0319 15:21:28.209527 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:21:28 crc kubenswrapper[4771]: W0319 15:21:28.248044 4771 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:21:28 crc kubenswrapper[4771]: E0319 15:21:28.248163 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:21:28 crc kubenswrapper[4771]: W0319 15:21:28.605187 4771 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:21:28 crc kubenswrapper[4771]: E0319 15:21:28.606276 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27267\": dial 
tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:21:28 crc kubenswrapper[4771]: W0319 15:21:28.717337 4771 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:21:28 crc kubenswrapper[4771]: E0319 15:21:28.717452 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:21:30 crc kubenswrapper[4771]: E0319 15:21:30.379604 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="6.4s" Mar 19 15:21:30 crc kubenswrapper[4771]: E0319 15:21:30.802048 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e47487eeedf90 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 15:21:19.923355536 +0000 UTC m=+339.151976738,LastTimestamp:2026-03-19 15:21:19.923355536 +0000 UTC m=+339.151976738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.507769 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.510792 4771 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.511397 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.512365 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.513901 4771 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.535749 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.535797 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:31 crc kubenswrapper[4771]: E0319 15:21:31.536281 4771 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.536814 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:31 crc kubenswrapper[4771]: W0319 15:21:31.582129 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-c6556205f867a41e60adc955a672a6b16b5be1337acd5d7ec348bc42c3aeaae7 WatchSource:0}: Error finding container c6556205f867a41e60adc955a672a6b16b5be1337acd5d7ec348bc42c3aeaae7: Status 404 returned error can't find the container with id c6556205f867a41e60adc955a672a6b16b5be1337acd5d7ec348bc42c3aeaae7 Mar 19 15:21:31 crc kubenswrapper[4771]: W0319 15:21:31.676729 4771 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:21:31 crc kubenswrapper[4771]: E0319 15:21:31.676804 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.854554 4771 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="6a8d8d93eab974aaafa21491525a300b39ef2e1a269bdfaec3165d4bb8a39e23" exitCode=0 Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.854645 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"6a8d8d93eab974aaafa21491525a300b39ef2e1a269bdfaec3165d4bb8a39e23"} Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.854912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c6556205f867a41e60adc955a672a6b16b5be1337acd5d7ec348bc42c3aeaae7"} Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.855167 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.855182 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:31 crc kubenswrapper[4771]: E0319 15:21:31.855578 4771 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.855634 4771 status_manager.go:851] "Failed to get status for pod" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 19 15:21:31 crc kubenswrapper[4771]: I0319 15:21:31.855887 4771 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: 
connection refused" Mar 19 15:21:31 crc kubenswrapper[4771]: W0319 15:21:31.955264 4771 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:21:31 crc kubenswrapper[4771]: E0319 15:21:31.955369 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:21:32 crc kubenswrapper[4771]: W0319 15:21:32.097294 4771 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27267": dial tcp 38.102.83.50:6443: connect: connection refused Mar 19 15:21:32 crc kubenswrapper[4771]: E0319 15:21:32.097373 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27267\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 19 15:21:32 crc kubenswrapper[4771]: I0319 15:21:32.879485 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"59a5d7acef18c1683d0c1ce8583ae058a441819c774e51b9cf311e914ada64cc"} Mar 19 15:21:32 crc kubenswrapper[4771]: I0319 15:21:32.879547 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"39930626a50f604fb8c77997f847a4d6f7c4bcb91a451eb4c5010b0e39adcf1d"} Mar 19 15:21:32 crc kubenswrapper[4771]: I0319 15:21:32.879558 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f6c41f6ab8d0fd36ff8b24f15151c3d8eb73f1128610107850ba172719f4e258"} Mar 19 15:21:33 crc kubenswrapper[4771]: I0319 15:21:33.889435 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"463d59e18430a4a9af51c86c4eb956fea040dacc9df886baf50b687b4a78ea83"} Mar 19 15:21:33 crc kubenswrapper[4771]: I0319 15:21:33.889805 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"17adcf8192de0d9ccd26b60801c4790b932d4418911e20a54de34f5d2dd0700c"} Mar 19 15:21:33 crc kubenswrapper[4771]: I0319 15:21:33.889834 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:33 crc kubenswrapper[4771]: I0319 15:21:33.889897 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:33 crc kubenswrapper[4771]: I0319 15:21:33.889937 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:34 crc kubenswrapper[4771]: I0319 15:21:34.897559 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 15:21:34 crc kubenswrapper[4771]: I0319 15:21:34.898817 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 15:21:34 crc kubenswrapper[4771]: I0319 15:21:34.898903 4771 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4" exitCode=1 Mar 19 15:21:34 crc kubenswrapper[4771]: I0319 15:21:34.898959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4"} Mar 19 15:21:34 crc kubenswrapper[4771]: I0319 15:21:34.899619 4771 scope.go:117] "RemoveContainer" containerID="dcc1650f4cc184b940a4fad0a9d7c1d593ece8735e59aed1c66f4417c2b862e4" Mar 19 15:21:35 crc kubenswrapper[4771]: I0319 15:21:35.913244 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 15:21:35 crc kubenswrapper[4771]: I0319 15:21:35.914384 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 15:21:35 crc kubenswrapper[4771]: I0319 15:21:35.914462 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6a5097f8b0ce3cbb88a068999f89e34b51cc6a2b7b1ccc3590f5f3fb340ed38b"} Mar 19 15:21:36 crc kubenswrapper[4771]: I0319 15:21:36.537046 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:36 crc kubenswrapper[4771]: I0319 15:21:36.537116 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:36 crc kubenswrapper[4771]: I0319 15:21:36.546576 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:36 crc kubenswrapper[4771]: I0319 15:21:36.674382 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:21:37 crc kubenswrapper[4771]: I0319 15:21:37.393628 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:21:37 crc kubenswrapper[4771]: I0319 15:21:37.399594 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:21:38 crc kubenswrapper[4771]: I0319 15:21:38.839330 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 15:21:38 crc kubenswrapper[4771]: I0319 15:21:38.903289 4771 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:38 crc kubenswrapper[4771]: I0319 15:21:38.934591 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:38 crc 
kubenswrapper[4771]: I0319 15:21:38.934621 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:38 crc kubenswrapper[4771]: I0319 15:21:38.939030 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 15:21:39 crc kubenswrapper[4771]: I0319 15:21:39.514685 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 15:21:39 crc kubenswrapper[4771]: I0319 15:21:39.940299 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:39 crc kubenswrapper[4771]: I0319 15:21:39.940339 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f231f29-5fc5-412c-ae86-574ab06a1fac" Mar 19 15:21:40 crc kubenswrapper[4771]: I0319 15:21:40.493568 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 15:21:40 crc kubenswrapper[4771]: E0319 15:21:40.526397 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 15:21:40 crc kubenswrapper[4771]: E0319 15:21:40.545840 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 15:21:40 crc kubenswrapper[4771]: E0319 15:21:40.551895 4771 pod_workers.go:1301] "Error syncing pod, 
skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zjhnk" podUID="7fb3bb21-b72b-45e1-9b87-73f281abba90" Mar 19 15:21:41 crc kubenswrapper[4771]: E0319 15:21:41.527983 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 15:21:41 crc kubenswrapper[4771]: I0319 15:21:41.568854 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7003a16d-d77a-4207-95f4-28cb73c8c824" Mar 19 15:21:44 crc kubenswrapper[4771]: I0319 15:21:44.301613 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 15:21:44 crc kubenswrapper[4771]: I0319 15:21:44.915049 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 15:21:46 crc kubenswrapper[4771]: I0319 15:21:46.545287 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: \"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:21:46 crc kubenswrapper[4771]: I0319 15:21:46.554238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fb3bb21-b72b-45e1-9b87-73f281abba90-metrics-certs\") pod \"network-metrics-daemon-zjhnk\" (UID: 
\"7fb3bb21-b72b-45e1-9b87-73f281abba90\") " pod="openshift-multus/network-metrics-daemon-zjhnk" Mar 19 15:21:46 crc kubenswrapper[4771]: I0319 15:21:46.644802 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 15:21:46 crc kubenswrapper[4771]: I0319 15:21:46.680821 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 15:21:49 crc kubenswrapper[4771]: I0319 15:21:49.734925 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 15:21:50 crc kubenswrapper[4771]: I0319 15:21:50.062703 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 15:21:50 crc kubenswrapper[4771]: I0319 15:21:50.876616 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 15:21:50 crc kubenswrapper[4771]: I0319 15:21:50.879350 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.300476 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.414318 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.508129 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.518640 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.528108 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjhnk"
Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.623287 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.654326 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.730400 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.731013 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.756348 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 15:21:51 crc kubenswrapper[4771]: I0319 15:21:51.792904 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.012863 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.022512 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" event={"ID":"7fb3bb21-b72b-45e1-9b87-73f281abba90","Type":"ContainerStarted","Data":"28286e036f561c146450850f483d700c28659a17f88fd3ea18a77a9c557ddcad"}
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.340409 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.371925 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.508716 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.509269 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.525940 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.719197 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.742350 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.792173 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.803472 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.803565 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 15:21:52 crc kubenswrapper[4771]: I0319 15:21:52.972666 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.027435 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.043095 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" event={"ID":"7fb3bb21-b72b-45e1-9b87-73f281abba90","Type":"ContainerStarted","Data":"d7ea981060cf7f0b7689364c62ff02041cabc1f5b353b59f36151738c4f30736"}
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.043154 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zjhnk" event={"ID":"7fb3bb21-b72b-45e1-9b87-73f281abba90","Type":"ContainerStarted","Data":"1013dbe368ae3162f6385aaa2ed8caddb03c811d5fe2e83f9abd8442924247a5"}
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.283594 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.446625 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.473370 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.572428 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.600076 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.600512 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.656923 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.672399 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.726801 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.832205 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.939282 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.982227 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 15:21:53 crc kubenswrapper[4771]: I0319 15:21:53.984951 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.114740 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.126637 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.140717 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.184316 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.248650 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.363158 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.440946 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.465297 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.482340 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.500182 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.509250 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.525057 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.544146 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.604742 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.615804 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.652608 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.837033 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 15:21:54 crc kubenswrapper[4771]: I0319 15:21:54.944706 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.023249 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.023461 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.073497 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.191403 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.226370 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.263769 4771 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.384918 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.434390 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.481091 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.491362 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.508372 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.632274 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.661311 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.774162 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 15:21:55 crc kubenswrapper[4771]: I0319 15:21:55.928438 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.064792 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.188298 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.204884 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.241639 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.624557 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.624789 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.624823 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.643948 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.728696 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.743205 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.784934 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.828810 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.834715 4771 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.887103 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.902320 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 15:21:56 crc kubenswrapper[4771]: I0319 15:21:56.990403 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.074314 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.148590 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.221303 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.282479 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.304317 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.325446 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.380283 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.388553 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.523544 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.553408 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.604661 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.617452 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.780598 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.916543 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 15:21:57 crc kubenswrapper[4771]: I0319 15:21:57.951420 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.114604 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.121100 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.134215 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.146107 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.263440 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.350651 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.371735 4771 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.626063 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.672341 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.689938 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.692551 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.786732 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.787625 4771 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.828894 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.864612 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.878725 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.883644 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 15:21:58 crc kubenswrapper[4771]: I0319 15:21:58.907104 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.025736 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.079469 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.083050 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.327458 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.374073 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.395111 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.497908 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.506128 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.609856 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.676088 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.709945 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.733434 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.742629 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.883600 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.890935 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 15:21:59 crc kubenswrapper[4771]: I0319 15:21:59.949309 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.001461 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.003437 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.125600 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.140931 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.216559 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.263751 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.360627 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.360888 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.387386 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.397099 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.427222 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.478600 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.479495 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.505411 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.565051 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.646594 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.730327 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.783404 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.786379 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.800103 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.876871 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.890587 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.898776 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.915562 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.918262 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 15:22:00 crc kubenswrapper[4771]: I0319 15:22:00.954136 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.071821 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.122701 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.197062 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.354945 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.377357 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.395427 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.443912 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.522801 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.554557 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.580680 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.610306 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.627789 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.664677 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.728499 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.760720 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.765575 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.799287 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.850056 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 19 15:22:01 crc kubenswrapper[4771]: I0319 15:22:01.991357 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.093434 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.208439 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.258659 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.269756 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.276774 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.348523 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.394838 4771 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.408502 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.477889 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.576881 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.579863 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.636491 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.667184 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.683581 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.689738 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.724609 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.791894 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 15:22:02 crc kubenswrapper[4771]: I0319 15:22:02.849729 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.015299 4771 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.018380 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.018366193 podStartE2EDuration="44.018366193s" podCreationTimestamp="2026-03-19 15:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:21:38.916133493 +0000 UTC m=+358.144754735" watchObservedRunningTime="2026-03-19 15:22:03.018366193 +0000 UTC m=+382.246987395"
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.019809 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zjhnk" podStartSLOduration=323.01980244 podStartE2EDuration="5m23.01980244s" podCreationTimestamp="2026-03-19 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:21:53.059686314 +0000 UTC m=+372.288307526" watchObservedRunningTime="2026-03-19 15:22:03.01980244 +0000 UTC m=+382.248423642"
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.020114 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.020158 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.020181 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zjhnk"]
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.024894 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.044829 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.044812503 podStartE2EDuration="25.044812503s" podCreationTimestamp="2026-03-19 15:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:22:03.041597328 +0000 UTC m=+382.270218530" watchObservedRunningTime="2026-03-19 15:22:03.044812503 +0000 UTC m=+382.273433705"
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.056202 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.104948 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.231813 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.270978 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.274949 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 19 15:22:03 crc kubenswrapper[4771]:
I0319 15:22:03.380162 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.442197 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.727409 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.747589 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.891279 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.897926 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 15:22:03 crc kubenswrapper[4771]: I0319 15:22:03.923293 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.000279 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.020656 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.039399 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.042178 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 15:22:04 crc 
kubenswrapper[4771]: I0319 15:22:04.163377 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.203975 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.229186 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.373093 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.380288 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.389655 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.495124 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.500069 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.548312 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.551451 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.613834 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"console-operator-config" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.698424 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.779140 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.893132 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.922454 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.984707 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 15:22:04 crc kubenswrapper[4771]: I0319 15:22:04.992762 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 15:22:05 crc kubenswrapper[4771]: I0319 15:22:05.114584 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 15:22:05 crc kubenswrapper[4771]: I0319 15:22:05.273368 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 15:22:05 crc kubenswrapper[4771]: I0319 15:22:05.317486 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 15:22:05 crc kubenswrapper[4771]: I0319 15:22:05.318600 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 15:22:05 crc 
kubenswrapper[4771]: I0319 15:22:05.643169 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.086945 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.225943 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565562-68tfj"] Mar 19 15:22:06 crc kubenswrapper[4771]: E0319 15:22:06.226315 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" containerName="installer" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.226341 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" containerName="installer" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.226474 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6730def2-f246-44ea-9ee9-93f4e490008d" containerName="installer" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.226945 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565562-68tfj" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.229377 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.230598 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.230742 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.237137 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565562-68tfj"] Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.261893 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7wnr\" (UniqueName: \"kubernetes.io/projected/cdb11256-816a-4907-9746-56e259e4fd29-kube-api-access-l7wnr\") pod \"auto-csr-approver-29565562-68tfj\" (UID: \"cdb11256-816a-4907-9746-56e259e4fd29\") " pod="openshift-infra/auto-csr-approver-29565562-68tfj" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.324192 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.362755 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7wnr\" (UniqueName: \"kubernetes.io/projected/cdb11256-816a-4907-9746-56e259e4fd29-kube-api-access-l7wnr\") pod \"auto-csr-approver-29565562-68tfj\" (UID: \"cdb11256-816a-4907-9746-56e259e4fd29\") " pod="openshift-infra/auto-csr-approver-29565562-68tfj" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.379669 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7wnr\" (UniqueName: 
\"kubernetes.io/projected/cdb11256-816a-4907-9746-56e259e4fd29-kube-api-access-l7wnr\") pod \"auto-csr-approver-29565562-68tfj\" (UID: \"cdb11256-816a-4907-9746-56e259e4fd29\") " pod="openshift-infra/auto-csr-approver-29565562-68tfj" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.548552 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565562-68tfj" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.585519 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.631510 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.791924 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 15:22:06 crc kubenswrapper[4771]: I0319 15:22:06.835888 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 15:22:07 crc kubenswrapper[4771]: I0319 15:22:07.061045 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 15:22:07 crc kubenswrapper[4771]: I0319 15:22:07.070202 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565562-68tfj"] Mar 19 15:22:07 crc kubenswrapper[4771]: W0319 15:22:07.084161 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb11256_816a_4907_9746_56e259e4fd29.slice/crio-3a43f3f65985c56262f959e4181217eef96109809f9e78a08d55665d558fe24e WatchSource:0}: Error finding container 3a43f3f65985c56262f959e4181217eef96109809f9e78a08d55665d558fe24e: Status 404 returned error can't find the 
container with id 3a43f3f65985c56262f959e4181217eef96109809f9e78a08d55665d558fe24e Mar 19 15:22:07 crc kubenswrapper[4771]: I0319 15:22:07.134875 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565562-68tfj" event={"ID":"cdb11256-816a-4907-9746-56e259e4fd29","Type":"ContainerStarted","Data":"3a43f3f65985c56262f959e4181217eef96109809f9e78a08d55665d558fe24e"} Mar 19 15:22:07 crc kubenswrapper[4771]: I0319 15:22:07.380220 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 15:22:07 crc kubenswrapper[4771]: I0319 15:22:07.417766 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 15:22:08 crc kubenswrapper[4771]: I0319 15:22:08.029306 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 15:22:09 crc kubenswrapper[4771]: I0319 15:22:09.149713 4771 generic.go:334] "Generic (PLEG): container finished" podID="cdb11256-816a-4907-9746-56e259e4fd29" containerID="dcc822adc2395f9f73e4d9667375ba7d2e57039a8088293b565abf4cf2474e0b" exitCode=0 Mar 19 15:22:09 crc kubenswrapper[4771]: I0319 15:22:09.149810 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565562-68tfj" event={"ID":"cdb11256-816a-4907-9746-56e259e4fd29","Type":"ContainerDied","Data":"dcc822adc2395f9f73e4d9667375ba7d2e57039a8088293b565abf4cf2474e0b"} Mar 19 15:22:10 crc kubenswrapper[4771]: I0319 15:22:10.539448 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565562-68tfj" Mar 19 15:22:10 crc kubenswrapper[4771]: I0319 15:22:10.721237 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7wnr\" (UniqueName: \"kubernetes.io/projected/cdb11256-816a-4907-9746-56e259e4fd29-kube-api-access-l7wnr\") pod \"cdb11256-816a-4907-9746-56e259e4fd29\" (UID: \"cdb11256-816a-4907-9746-56e259e4fd29\") " Mar 19 15:22:10 crc kubenswrapper[4771]: I0319 15:22:10.726052 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb11256-816a-4907-9746-56e259e4fd29-kube-api-access-l7wnr" (OuterVolumeSpecName: "kube-api-access-l7wnr") pod "cdb11256-816a-4907-9746-56e259e4fd29" (UID: "cdb11256-816a-4907-9746-56e259e4fd29"). InnerVolumeSpecName "kube-api-access-l7wnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:22:10 crc kubenswrapper[4771]: I0319 15:22:10.822242 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7wnr\" (UniqueName: \"kubernetes.io/projected/cdb11256-816a-4907-9746-56e259e4fd29-kube-api-access-l7wnr\") on node \"crc\" DevicePath \"\"" Mar 19 15:22:11 crc kubenswrapper[4771]: I0319 15:22:11.162980 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565562-68tfj" event={"ID":"cdb11256-816a-4907-9746-56e259e4fd29","Type":"ContainerDied","Data":"3a43f3f65985c56262f959e4181217eef96109809f9e78a08d55665d558fe24e"} Mar 19 15:22:11 crc kubenswrapper[4771]: I0319 15:22:11.163033 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a43f3f65985c56262f959e4181217eef96109809f9e78a08d55665d558fe24e" Mar 19 15:22:11 crc kubenswrapper[4771]: I0319 15:22:11.163088 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565562-68tfj" Mar 19 15:22:12 crc kubenswrapper[4771]: I0319 15:22:12.565841 4771 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 15:22:12 crc kubenswrapper[4771]: I0319 15:22:12.566165 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d" gracePeriod=5 Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.167802 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.169146 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.198031 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.198125 4771 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d" exitCode=137 Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.198187 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.198206 4771 scope.go:117] "RemoveContainer" containerID="9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.214653 4771 scope.go:117] "RemoveContainer" containerID="9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d" Mar 19 15:22:18 crc kubenswrapper[4771]: E0319 15:22:18.215224 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d\": container with ID starting with 9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d not found: ID does not exist" containerID="9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.215293 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d"} err="failed to get container status \"9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d\": rpc error: code = NotFound desc = could not find container \"9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d\": container with ID starting with 9bd0cb88dd917a6b19b969bb2c31584f8beebc52576d700c3ba9969ff0995a2d not found: ID does not exist" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.229177 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.229248 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.229280 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.229277 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.229324 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.229324 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.229668 4771 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.229692 4771 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.229703 4771 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.330287 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.330327 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.330516 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.341304 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.431457 4771 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 15:22:18 crc kubenswrapper[4771]: I0319 15:22:18.431525 4771 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 15:22:19 crc kubenswrapper[4771]: I0319 15:22:19.516521 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 19 15:22:19 crc kubenswrapper[4771]: I0319 15:22:19.517310 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 19 15:22:19 crc kubenswrapper[4771]: I0319 15:22:19.529581 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 15:22:19 crc kubenswrapper[4771]: I0319 15:22:19.529633 4771 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="03fad688-afd5-4dd0-8a39-727cf8deca37" Mar 19 15:22:19 crc kubenswrapper[4771]: I0319 15:22:19.532703 4771 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 15:22:19 crc kubenswrapper[4771]: I0319 15:22:19.532740 4771 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="03fad688-afd5-4dd0-8a39-727cf8deca37" Mar 19 15:22:22 crc kubenswrapper[4771]: I0319 15:22:22.224598 4771 generic.go:334] "Generic (PLEG): container finished" podID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerID="e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652" exitCode=0 Mar 19 15:22:22 crc kubenswrapper[4771]: I0319 15:22:22.224663 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" event={"ID":"b4eb7061-dde4-44f1-943a-219d2f4f5071","Type":"ContainerDied","Data":"e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652"} Mar 19 15:22:22 crc kubenswrapper[4771]: I0319 15:22:22.225152 4771 scope.go:117] "RemoveContainer" containerID="e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652" Mar 19 15:22:23 crc kubenswrapper[4771]: I0319 15:22:23.236090 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" event={"ID":"b4eb7061-dde4-44f1-943a-219d2f4f5071","Type":"ContainerStarted","Data":"03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6"} Mar 19 15:22:23 crc kubenswrapper[4771]: I0319 15:22:23.236946 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:22:23 crc kubenswrapper[4771]: I0319 15:22:23.242537 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.269213 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-tdkbr"] Mar 19 15:23:14 crc kubenswrapper[4771]: E0319 15:23:14.270101 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb11256-816a-4907-9746-56e259e4fd29" containerName="oc" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.270122 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb11256-816a-4907-9746-56e259e4fd29" containerName="oc" Mar 19 15:23:14 crc kubenswrapper[4771]: E0319 15:23:14.270142 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.270155 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.270354 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb11256-816a-4907-9746-56e259e4fd29" containerName="oc" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.270376 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.271076 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.288879 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tdkbr"] Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.385500 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.385577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-bound-sa-token\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.385761 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-registry-certificates\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.385824 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-trusted-ca\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.385901 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.385936 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-registry-tls\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.385957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5tkq\" (UniqueName: \"kubernetes.io/projected/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-kube-api-access-l5tkq\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.386008 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.421834 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.487334 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.487403 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-bound-sa-token\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.487464 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-registry-certificates\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.487507 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-trusted-ca\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc 
kubenswrapper[4771]: I0319 15:23:14.487560 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-registry-tls\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.487592 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5tkq\" (UniqueName: \"kubernetes.io/projected/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-kube-api-access-l5tkq\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.487656 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.488543 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.489749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-registry-certificates\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.491197 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-trusted-ca\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.494089 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.501737 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-registry-tls\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.517267 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-bound-sa-token\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: \"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.520745 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5tkq\" (UniqueName: \"kubernetes.io/projected/d381c12a-2fec-406d-a9b9-e60b41ca9ba0-kube-api-access-l5tkq\") pod \"image-registry-66df7c8f76-tdkbr\" (UID: 
\"d381c12a-2fec-406d-a9b9-e60b41ca9ba0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.594649 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:14 crc kubenswrapper[4771]: I0319 15:23:14.840816 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tdkbr"] Mar 19 15:23:15 crc kubenswrapper[4771]: I0319 15:23:15.538929 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" event={"ID":"d381c12a-2fec-406d-a9b9-e60b41ca9ba0","Type":"ContainerStarted","Data":"2d3ac5cdd9abcfec03c07b66de3055370b3372a2c6b54efb828b6e93a1bd3456"} Mar 19 15:23:15 crc kubenswrapper[4771]: I0319 15:23:15.539314 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:15 crc kubenswrapper[4771]: I0319 15:23:15.539329 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" event={"ID":"d381c12a-2fec-406d-a9b9-e60b41ca9ba0","Type":"ContainerStarted","Data":"cf315be354344c19cd7af122806ffaf3542311cf854ba033e3a4a676d03ac715"} Mar 19 15:23:15 crc kubenswrapper[4771]: I0319 15:23:15.561734 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" podStartSLOduration=1.561707191 podStartE2EDuration="1.561707191s" podCreationTimestamp="2026-03-19 15:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:23:15.559706579 +0000 UTC m=+454.788327781" watchObservedRunningTime="2026-03-19 15:23:15.561707191 +0000 UTC m=+454.790328433" Mar 19 15:23:23 crc kubenswrapper[4771]: I0319 
15:23:23.028019 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:23:23 crc kubenswrapper[4771]: I0319 15:23:23.028507 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:23:28 crc kubenswrapper[4771]: I0319 15:23:28.544621 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:23:28 crc kubenswrapper[4771]: I0319 15:23:28.545220 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:23:28 crc kubenswrapper[4771]: I0319 15:23:28.546354 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:23:28 crc kubenswrapper[4771]: I0319 15:23:28.554030 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:23:28 crc kubenswrapper[4771]: I0319 15:23:28.810144 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 15:23:29 crc kubenswrapper[4771]: I0319 15:23:29.566075 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:23:29 crc kubenswrapper[4771]: I0319 15:23:29.566961 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:23:29 crc kubenswrapper[4771]: I0319 15:23:29.574583 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:23:29 crc kubenswrapper[4771]: I0319 15:23:29.574669 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:23:29 crc kubenswrapper[4771]: I0319 15:23:29.610977 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 15:23:29 crc kubenswrapper[4771]: I0319 15:23:29.648039 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5ad8a07185c1e00b637e6427ac93139fad0623b3924c5294e5d4c37a15313849"} Mar 19 15:23:29 crc kubenswrapper[4771]: I0319 15:23:29.648149 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"139cef22df48ba008729d7cfb78025b4e83de5ad9ad96bcdc10d9a9d84f31d9b"} Mar 19 15:23:29 crc kubenswrapper[4771]: I0319 15:23:29.709523 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:23:30 crc kubenswrapper[4771]: W0319 15:23:30.063613 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-da3a914d62eadfd92581387194897fb3d20e2acb85885d93e3de30a04bbc8be3 WatchSource:0}: Error finding container da3a914d62eadfd92581387194897fb3d20e2acb85885d93e3de30a04bbc8be3: Status 404 returned error can't find the container with id da3a914d62eadfd92581387194897fb3d20e2acb85885d93e3de30a04bbc8be3 Mar 19 15:23:30 crc kubenswrapper[4771]: I0319 15:23:30.656898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f594c25e4c84e0b6e28c5cddfdc2156d4c628918ab7e22e0ff11e769c1fcd109"} Mar 19 15:23:30 crc kubenswrapper[4771]: I0319 15:23:30.657155 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"632c36a5c6765e0b70da57a9d84a324b94f16e554622dd8811689caa5b6bd3e3"} Mar 19 15:23:30 crc kubenswrapper[4771]: I0319 15:23:30.657373 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 15:23:30 crc kubenswrapper[4771]: I0319 15:23:30.662882 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f00afb84337b1cf6591328476651a77d962e6d01b9f1bdad95b10e43961be642"} Mar 19 15:23:30 crc kubenswrapper[4771]: I0319 15:23:30.662918 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"da3a914d62eadfd92581387194897fb3d20e2acb85885d93e3de30a04bbc8be3"} Mar 19 15:23:34 crc kubenswrapper[4771]: I0319 15:23:34.603774 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tdkbr" Mar 19 15:23:34 crc kubenswrapper[4771]: I0319 15:23:34.699506 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hcvsv"] Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.708211 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlzb8"] Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.709886 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rlzb8" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" containerName="registry-server" containerID="cri-o://188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810" gracePeriod=30 Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.717275 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4fmgq"] Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.717849 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4fmgq" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerName="registry-server" containerID="cri-o://944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47" gracePeriod=30 Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.726220 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7fqvf"] Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.726470 4771 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerName="marketplace-operator" containerID="cri-o://03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6" gracePeriod=30 Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.730489 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmtnc"] Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.730776 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dmtnc" podUID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerName="registry-server" containerID="cri-o://615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562" gracePeriod=30 Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.747959 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tdfqz"] Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.748335 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tdfqz" podUID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerName="registry-server" containerID="cri-o://06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c" gracePeriod=30 Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.760087 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8fgq"] Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.760704 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.791761 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8fgq"] Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.848540 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7x56\" (UniqueName: \"kubernetes.io/projected/676c8f4d-415b-4d2d-bc01-2b62ee6c32b5-kube-api-access-f7x56\") pod \"marketplace-operator-79b997595-p8fgq\" (UID: \"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.848860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/676c8f4d-415b-4d2d-bc01-2b62ee6c32b5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8fgq\" (UID: \"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.848888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/676c8f4d-415b-4d2d-bc01-2b62ee6c32b5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8fgq\" (UID: \"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.950470 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/676c8f4d-415b-4d2d-bc01-2b62ee6c32b5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8fgq\" (UID: 
\"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.950531 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/676c8f4d-415b-4d2d-bc01-2b62ee6c32b5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8fgq\" (UID: \"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.950607 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7x56\" (UniqueName: \"kubernetes.io/projected/676c8f4d-415b-4d2d-bc01-2b62ee6c32b5-kube-api-access-f7x56\") pod \"marketplace-operator-79b997595-p8fgq\" (UID: \"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.952517 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/676c8f4d-415b-4d2d-bc01-2b62ee6c32b5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8fgq\" (UID: \"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.956425 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/676c8f4d-415b-4d2d-bc01-2b62ee6c32b5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8fgq\" (UID: \"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: I0319 15:23:52.971831 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-f7x56\" (UniqueName: \"kubernetes.io/projected/676c8f4d-415b-4d2d-bc01-2b62ee6c32b5-kube-api-access-f7x56\") pod \"marketplace-operator-79b997595-p8fgq\" (UID: \"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:52 crc kubenswrapper[4771]: E0319 15:23:52.995059 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810 is running failed: container process not found" containerID="188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 15:23:52 crc kubenswrapper[4771]: E0319 15:23:52.995512 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810 is running failed: container process not found" containerID="188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 15:23:52 crc kubenswrapper[4771]: E0319 15:23:52.996347 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810 is running failed: container process not found" containerID="188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 15:23:52 crc kubenswrapper[4771]: E0319 15:23:52.996376 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810 is running failed: container process not found" probeType="Readiness" 
pod="openshift-marketplace/certified-operators-rlzb8" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" containerName="registry-server" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.028073 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.028172 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.080432 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.157170 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.166470 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.171150 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmtnc" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.187949 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.188766 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255235 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-utilities\") pod \"b49408ed-5087-4cb2-b70e-391c32aad069\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255289 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-utilities\") pod \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255321 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-utilities\") pod \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255352 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gprh\" (UniqueName: \"kubernetes.io/projected/b4eb7061-dde4-44f1-943a-219d2f4f5071-kube-api-access-7gprh\") pod \"b4eb7061-dde4-44f1-943a-219d2f4f5071\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255375 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-catalog-content\") pod \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\" (UID: 
\"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255418 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-catalog-content\") pod \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255456 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-trusted-ca\") pod \"b4eb7061-dde4-44f1-943a-219d2f4f5071\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255505 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qw49\" (UniqueName: \"kubernetes.io/projected/65b4c9ce-e8af-4eca-abf5-08149432aaa5-kube-api-access-5qw49\") pod \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255555 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-catalog-content\") pod \"b49408ed-5087-4cb2-b70e-391c32aad069\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255592 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl5fw\" (UniqueName: \"kubernetes.io/projected/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-kube-api-access-dl5fw\") pod \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\" (UID: \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255624 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-x9mmh\" (UniqueName: \"kubernetes.io/projected/b49408ed-5087-4cb2-b70e-391c32aad069-kube-api-access-x9mmh\") pod \"b49408ed-5087-4cb2-b70e-391c32aad069\" (UID: \"b49408ed-5087-4cb2-b70e-391c32aad069\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255652 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-catalog-content\") pod \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\" (UID: \"65b4c9ce-e8af-4eca-abf5-08149432aaa5\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255691 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-operator-metrics\") pod \"b4eb7061-dde4-44f1-943a-219d2f4f5071\" (UID: \"b4eb7061-dde4-44f1-943a-219d2f4f5071\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255731 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnv6j\" (UniqueName: \"kubernetes.io/projected/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-kube-api-access-pnv6j\") pod \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\" (UID: \"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.255760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-utilities\") pod \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\" (UID: \"ae9495d8-bbe9-4f54-8c12-56b9f40530e1\") " Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.256347 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-utilities" (OuterVolumeSpecName: "utilities") pod "65b4c9ce-e8af-4eca-abf5-08149432aaa5" (UID: 
"65b4c9ce-e8af-4eca-abf5-08149432aaa5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.257066 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-utilities" (OuterVolumeSpecName: "utilities") pod "7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" (UID: "7f49e754-62f6-4f17-a6ed-fe5e3abe32b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.257134 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-utilities" (OuterVolumeSpecName: "utilities") pod "b49408ed-5087-4cb2-b70e-391c32aad069" (UID: "b49408ed-5087-4cb2-b70e-391c32aad069"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.257917 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b4eb7061-dde4-44f1-943a-219d2f4f5071" (UID: "b4eb7061-dde4-44f1-943a-219d2f4f5071"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.258568 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-utilities" (OuterVolumeSpecName: "utilities") pod "ae9495d8-bbe9-4f54-8c12-56b9f40530e1" (UID: "ae9495d8-bbe9-4f54-8c12-56b9f40530e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.269601 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b4eb7061-dde4-44f1-943a-219d2f4f5071" (UID: "b4eb7061-dde4-44f1-943a-219d2f4f5071"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.270343 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49408ed-5087-4cb2-b70e-391c32aad069-kube-api-access-x9mmh" (OuterVolumeSpecName: "kube-api-access-x9mmh") pod "b49408ed-5087-4cb2-b70e-391c32aad069" (UID: "b49408ed-5087-4cb2-b70e-391c32aad069"). InnerVolumeSpecName "kube-api-access-x9mmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.270601 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-kube-api-access-pnv6j" (OuterVolumeSpecName: "kube-api-access-pnv6j") pod "7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" (UID: "7f49e754-62f6-4f17-a6ed-fe5e3abe32b4"). InnerVolumeSpecName "kube-api-access-pnv6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.270633 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4eb7061-dde4-44f1-943a-219d2f4f5071-kube-api-access-7gprh" (OuterVolumeSpecName: "kube-api-access-7gprh") pod "b4eb7061-dde4-44f1-943a-219d2f4f5071" (UID: "b4eb7061-dde4-44f1-943a-219d2f4f5071"). InnerVolumeSpecName "kube-api-access-7gprh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.271444 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-kube-api-access-dl5fw" (OuterVolumeSpecName: "kube-api-access-dl5fw") pod "ae9495d8-bbe9-4f54-8c12-56b9f40530e1" (UID: "ae9495d8-bbe9-4f54-8c12-56b9f40530e1"). InnerVolumeSpecName "kube-api-access-dl5fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.279737 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b4c9ce-e8af-4eca-abf5-08149432aaa5-kube-api-access-5qw49" (OuterVolumeSpecName: "kube-api-access-5qw49") pod "65b4c9ce-e8af-4eca-abf5-08149432aaa5" (UID: "65b4c9ce-e8af-4eca-abf5-08149432aaa5"). InnerVolumeSpecName "kube-api-access-5qw49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.286614 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65b4c9ce-e8af-4eca-abf5-08149432aaa5" (UID: "65b4c9ce-e8af-4eca-abf5-08149432aaa5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.312181 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b49408ed-5087-4cb2-b70e-391c32aad069" (UID: "b49408ed-5087-4cb2-b70e-391c32aad069"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.320552 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae9495d8-bbe9-4f54-8c12-56b9f40530e1" (UID: "ae9495d8-bbe9-4f54-8c12-56b9f40530e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357088 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnv6j\" (UniqueName: \"kubernetes.io/projected/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-kube-api-access-pnv6j\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357331 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357402 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357460 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357523 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357585 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gprh\" (UniqueName: 
\"kubernetes.io/projected/b4eb7061-dde4-44f1-943a-219d2f4f5071-kube-api-access-7gprh\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357642 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357707 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357763 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qw49\" (UniqueName: \"kubernetes.io/projected/65b4c9ce-e8af-4eca-abf5-08149432aaa5-kube-api-access-5qw49\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357817 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49408ed-5087-4cb2-b70e-391c32aad069-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357877 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl5fw\" (UniqueName: \"kubernetes.io/projected/ae9495d8-bbe9-4f54-8c12-56b9f40530e1-kube-api-access-dl5fw\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.357939 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9mmh\" (UniqueName: \"kubernetes.io/projected/b49408ed-5087-4cb2-b70e-391c32aad069-kube-api-access-x9mmh\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.358008 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/65b4c9ce-e8af-4eca-abf5-08149432aaa5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.358075 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b4eb7061-dde4-44f1-943a-219d2f4f5071-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.385583 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" (UID: "7f49e754-62f6-4f17-a6ed-fe5e3abe32b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.459228 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.499668 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8fgq"] Mar 19 15:23:53 crc kubenswrapper[4771]: W0319 15:23:53.506313 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676c8f4d_415b_4d2d_bc01_2b62ee6c32b5.slice/crio-cc2d6192bd0411276a97ddd8d29b800c0bd6948e4c1a70a7fbdf541b1d288405 WatchSource:0}: Error finding container cc2d6192bd0411276a97ddd8d29b800c0bd6948e4c1a70a7fbdf541b1d288405: Status 404 returned error can't find the container with id cc2d6192bd0411276a97ddd8d29b800c0bd6948e4c1a70a7fbdf541b1d288405 Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.822050 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerID="06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c" exitCode=0 Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.822147 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tdfqz" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.822148 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tdfqz" event={"ID":"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4","Type":"ContainerDied","Data":"06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.823184 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tdfqz" event={"ID":"7f49e754-62f6-4f17-a6ed-fe5e3abe32b4","Type":"ContainerDied","Data":"00d07be24cd01501f2c49ca3c4cddc06b9772cc0fd4fa70fb6541055b211c326"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.823212 4771 scope.go:117] "RemoveContainer" containerID="06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.827697 4771 generic.go:334] "Generic (PLEG): container finished" podID="b49408ed-5087-4cb2-b70e-391c32aad069" containerID="188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810" exitCode=0 Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.827750 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzb8" event={"ID":"b49408ed-5087-4cb2-b70e-391c32aad069","Type":"ContainerDied","Data":"188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.827772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzb8" 
event={"ID":"b49408ed-5087-4cb2-b70e-391c32aad069","Type":"ContainerDied","Data":"c01c5999bed964b7f7e8295008a402360e7143c4f7b39d330d5e3982c5d2b86a"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.827852 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlzb8" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.832936 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" event={"ID":"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5","Type":"ContainerStarted","Data":"4aa586c64ab8fd693b6afb200d9844d98f9d4de10887742829fc5d6483acdc24"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.832977 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" event={"ID":"676c8f4d-415b-4d2d-bc01-2b62ee6c32b5","Type":"ContainerStarted","Data":"cc2d6192bd0411276a97ddd8d29b800c0bd6948e4c1a70a7fbdf541b1d288405"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.833411 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.835181 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p8fgq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" start-of-body= Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.835229 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" podUID="676c8f4d-415b-4d2d-bc01-2b62ee6c32b5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" Mar 19 15:23:53 crc 
kubenswrapper[4771]: I0319 15:23:53.853150 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tdfqz"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.854269 4771 scope.go:117] "RemoveContainer" containerID="777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.855241 4771 generic.go:334] "Generic (PLEG): container finished" podID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerID="03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6" exitCode=0 Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.855325 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" event={"ID":"b4eb7061-dde4-44f1-943a-219d2f4f5071","Type":"ContainerDied","Data":"03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.855358 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" event={"ID":"b4eb7061-dde4-44f1-943a-219d2f4f5071","Type":"ContainerDied","Data":"6062d69dc80ba1a7481eed0919c864d812434d26b533306e215bf893c3aa329c"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.855427 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7fqvf" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.857793 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tdfqz"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.862167 4771 generic.go:334] "Generic (PLEG): container finished" podID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerID="944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47" exitCode=0 Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.862231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fmgq" event={"ID":"ae9495d8-bbe9-4f54-8c12-56b9f40530e1","Type":"ContainerDied","Data":"944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.862259 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fmgq" event={"ID":"ae9495d8-bbe9-4f54-8c12-56b9f40530e1","Type":"ContainerDied","Data":"c279eaaf44c6bf82eb37a43943f19c5c5b4c332b02c97ba959f4b9389fc5c5d4"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.862347 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4fmgq" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.868339 4771 generic.go:334] "Generic (PLEG): container finished" podID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerID="615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562" exitCode=0 Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.868392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmtnc" event={"ID":"65b4c9ce-e8af-4eca-abf5-08149432aaa5","Type":"ContainerDied","Data":"615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.868432 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmtnc" event={"ID":"65b4c9ce-e8af-4eca-abf5-08149432aaa5","Type":"ContainerDied","Data":"515d6284b025565bf857a3b989c2bf7975aea54ea553ab963038b1dee40a2a2c"} Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.868514 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmtnc" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.869237 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" podStartSLOduration=1.869217014 podStartE2EDuration="1.869217014s" podCreationTimestamp="2026-03-19 15:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:23:53.865076088 +0000 UTC m=+493.093697320" watchObservedRunningTime="2026-03-19 15:23:53.869217014 +0000 UTC m=+493.097838226" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.896624 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlzb8"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.901798 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rlzb8"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.905114 4771 scope.go:117] "RemoveContainer" containerID="c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.922936 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4fmgq"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.923778 4771 scope.go:117] "RemoveContainer" containerID="06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c" Mar 19 15:23:53 crc kubenswrapper[4771]: E0319 15:23:53.924496 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c\": container with ID starting with 06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c not found: ID does not exist" 
containerID="06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.924572 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c"} err="failed to get container status \"06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c\": rpc error: code = NotFound desc = could not find container \"06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c\": container with ID starting with 06e53e911b793778d34f78086db758bc28dc029bfd033570bcf77fc39703423c not found: ID does not exist" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.924605 4771 scope.go:117] "RemoveContainer" containerID="777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc" Mar 19 15:23:53 crc kubenswrapper[4771]: E0319 15:23:53.925033 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc\": container with ID starting with 777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc not found: ID does not exist" containerID="777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.925062 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc"} err="failed to get container status \"777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc\": rpc error: code = NotFound desc = could not find container \"777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc\": container with ID starting with 777aa2e708d91c6d7a2c38559877bfefe31f239aed167c6666ebff766402d8dc not found: ID does not exist" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.925104 4771 scope.go:117] 
"RemoveContainer" containerID="c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39" Mar 19 15:23:53 crc kubenswrapper[4771]: E0319 15:23:53.925500 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39\": container with ID starting with c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39 not found: ID does not exist" containerID="c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.925582 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39"} err="failed to get container status \"c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39\": rpc error: code = NotFound desc = could not find container \"c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39\": container with ID starting with c58002d121f59cc75a38d441a65c463656e0e0a6b35ceee40ba6536d90a70b39 not found: ID does not exist" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.925645 4771 scope.go:117] "RemoveContainer" containerID="188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.935535 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4fmgq"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.941669 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmtnc"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.946381 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmtnc"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.948067 4771 scope.go:117] "RemoveContainer" 
containerID="fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.950028 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7fqvf"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.952629 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7fqvf"] Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.964171 4771 scope.go:117] "RemoveContainer" containerID="34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.982126 4771 scope.go:117] "RemoveContainer" containerID="188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810" Mar 19 15:23:53 crc kubenswrapper[4771]: E0319 15:23:53.982875 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810\": container with ID starting with 188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810 not found: ID does not exist" containerID="188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.982941 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810"} err="failed to get container status \"188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810\": rpc error: code = NotFound desc = could not find container \"188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810\": container with ID starting with 188cd9b03d1d760e8865718d9f7b0c60aa53c72f79601b7a5ac85307d89b2810 not found: ID does not exist" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.982974 4771 scope.go:117] "RemoveContainer" 
containerID="fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38" Mar 19 15:23:53 crc kubenswrapper[4771]: E0319 15:23:53.983469 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38\": container with ID starting with fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38 not found: ID does not exist" containerID="fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.983507 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38"} err="failed to get container status \"fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38\": rpc error: code = NotFound desc = could not find container \"fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38\": container with ID starting with fa3e2911a377705e179236637ff3e39ad003b3198a9a168a7c42cd1169c41f38 not found: ID does not exist" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.983541 4771 scope.go:117] "RemoveContainer" containerID="34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401" Mar 19 15:23:53 crc kubenswrapper[4771]: E0319 15:23:53.984109 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401\": container with ID starting with 34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401 not found: ID does not exist" containerID="34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.984141 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401"} err="failed to get container status \"34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401\": rpc error: code = NotFound desc = could not find container \"34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401\": container with ID starting with 34d520fb2800a2983838098311054ab96e10ffcc3bd3d8b8211877ab06849401 not found: ID does not exist" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.984182 4771 scope.go:117] "RemoveContainer" containerID="03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6" Mar 19 15:23:53 crc kubenswrapper[4771]: I0319 15:23:53.997463 4771 scope.go:117] "RemoveContainer" containerID="e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.019125 4771 scope.go:117] "RemoveContainer" containerID="03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.019527 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6\": container with ID starting with 03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6 not found: ID does not exist" containerID="03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.019569 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6"} err="failed to get container status \"03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6\": rpc error: code = NotFound desc = could not find container \"03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6\": container with ID starting with 
03f9d4d9e2995f09f3acd59a40aa524141cd7d6e2f0fe7d530da3a0b07185ea6 not found: ID does not exist" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.019609 4771 scope.go:117] "RemoveContainer" containerID="e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.019995 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652\": container with ID starting with e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652 not found: ID does not exist" containerID="e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.020017 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652"} err="failed to get container status \"e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652\": rpc error: code = NotFound desc = could not find container \"e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652\": container with ID starting with e5431da28efcb15adba752f5d58f4db6fdc61a9c5d1fde67d2e0b8a038714652 not found: ID does not exist" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.020030 4771 scope.go:117] "RemoveContainer" containerID="944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.031905 4771 scope.go:117] "RemoveContainer" containerID="ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.045700 4771 scope.go:117] "RemoveContainer" containerID="d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.060921 4771 scope.go:117] "RemoveContainer" 
containerID="944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.061359 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47\": container with ID starting with 944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47 not found: ID does not exist" containerID="944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.061405 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47"} err="failed to get container status \"944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47\": rpc error: code = NotFound desc = could not find container \"944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47\": container with ID starting with 944a773e6fac49c9e6a8d813447714711db4106a891c13821d1cc90387016f47 not found: ID does not exist" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.061432 4771 scope.go:117] "RemoveContainer" containerID="ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.061718 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80\": container with ID starting with ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80 not found: ID does not exist" containerID="ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.061763 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80"} err="failed to get container status \"ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80\": rpc error: code = NotFound desc = could not find container \"ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80\": container with ID starting with ed2f2c15294c7f6352d925fb098f33bfbef8ffb328d72ce96a1360db33bc1f80 not found: ID does not exist" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.061784 4771 scope.go:117] "RemoveContainer" containerID="d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.062047 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4\": container with ID starting with d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4 not found: ID does not exist" containerID="d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.062090 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4"} err="failed to get container status \"d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4\": rpc error: code = NotFound desc = could not find container \"d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4\": container with ID starting with d3392fcd9bb5d55d9e1d9463a62d71d660adfb386daffc78ed87972fb4ffc8b4 not found: ID does not exist" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.062119 4771 scope.go:117] "RemoveContainer" containerID="615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.075454 4771 scope.go:117] "RemoveContainer" 
containerID="ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.090230 4771 scope.go:117] "RemoveContainer" containerID="2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.101719 4771 scope.go:117] "RemoveContainer" containerID="615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.102126 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562\": container with ID starting with 615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562 not found: ID does not exist" containerID="615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.102162 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562"} err="failed to get container status \"615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562\": rpc error: code = NotFound desc = could not find container \"615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562\": container with ID starting with 615ba86dda50e86dfefc4ffe0568bc4bf2fee970daa80e02f4fa0d13f9ceb562 not found: ID does not exist" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.102186 4771 scope.go:117] "RemoveContainer" containerID="ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.102569 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff\": container with ID starting with 
ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff not found: ID does not exist" containerID="ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.102609 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff"} err="failed to get container status \"ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff\": rpc error: code = NotFound desc = could not find container \"ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff\": container with ID starting with ad955d7eb09d1b517073fce8e813d1e788d8d6f608a543b930b4bffe4d1c99ff not found: ID does not exist" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.102657 4771 scope.go:117] "RemoveContainer" containerID="2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.103072 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a\": container with ID starting with 2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a not found: ID does not exist" containerID="2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.103099 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a"} err="failed to get container status \"2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a\": rpc error: code = NotFound desc = could not find container \"2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a\": container with ID starting with 2119078b850e192f802ad2becfc2c3a2bf81cb57ca9e20f02765f3702e5ae26a not found: ID does not 
exist" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.881533 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p8fgq" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.935768 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7vgqh"] Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936276 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerName="extract-utilities" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936300 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerName="extract-utilities" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936350 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" containerName="extract-content" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936361 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" containerName="extract-content" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936434 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerName="marketplace-operator" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936445 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerName="marketplace-operator" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936455 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerName="extract-utilities" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936463 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerName="extract-utilities" Mar 19 15:23:54 
crc kubenswrapper[4771]: E0319 15:23:54.936476 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerName="extract-content" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936484 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerName="extract-content" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936531 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerName="extract-content" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936539 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerName="extract-content" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936551 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerName="extract-utilities" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936558 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerName="extract-utilities" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936603 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936612 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936625 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936634 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerName="registry-server" Mar 19 15:23:54 crc 
kubenswrapper[4771]: E0319 15:23:54.936644 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" containerName="extract-utilities" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936653 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" containerName="extract-utilities" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936697 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936707 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936718 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936726 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.936738 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerName="extract-content" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.936779 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerName="extract-content" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.937040 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.937058 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" containerName="registry-server" Mar 19 15:23:54 crc 
kubenswrapper[4771]: I0319 15:23:54.937072 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerName="marketplace-operator" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.937119 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerName="marketplace-operator" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.937133 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.937143 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" containerName="registry-server" Mar 19 15:23:54 crc kubenswrapper[4771]: E0319 15:23:54.937370 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerName="marketplace-operator" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.937387 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" containerName="marketplace-operator" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.938717 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vgqh"] Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.938830 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:54 crc kubenswrapper[4771]: I0319 15:23:54.942093 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.093569 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce222397-3f74-43e5-85d9-39ed2aa02daf-catalog-content\") pod \"redhat-marketplace-7vgqh\" (UID: \"ce222397-3f74-43e5-85d9-39ed2aa02daf\") " pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.093864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4jjb\" (UniqueName: \"kubernetes.io/projected/ce222397-3f74-43e5-85d9-39ed2aa02daf-kube-api-access-m4jjb\") pod \"redhat-marketplace-7vgqh\" (UID: \"ce222397-3f74-43e5-85d9-39ed2aa02daf\") " pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.093894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce222397-3f74-43e5-85d9-39ed2aa02daf-utilities\") pod \"redhat-marketplace-7vgqh\" (UID: \"ce222397-3f74-43e5-85d9-39ed2aa02daf\") " pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.114278 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s7zdn"] Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.115167 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.116787 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.127973 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s7zdn"] Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.194895 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xvvh\" (UniqueName: \"kubernetes.io/projected/8de902f3-47bf-470f-8d12-1d5920226652-kube-api-access-4xvvh\") pod \"certified-operators-s7zdn\" (UID: \"8de902f3-47bf-470f-8d12-1d5920226652\") " pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.195032 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de902f3-47bf-470f-8d12-1d5920226652-catalog-content\") pod \"certified-operators-s7zdn\" (UID: \"8de902f3-47bf-470f-8d12-1d5920226652\") " pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.195107 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce222397-3f74-43e5-85d9-39ed2aa02daf-utilities\") pod \"redhat-marketplace-7vgqh\" (UID: \"ce222397-3f74-43e5-85d9-39ed2aa02daf\") " pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.195203 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce222397-3f74-43e5-85d9-39ed2aa02daf-catalog-content\") pod \"redhat-marketplace-7vgqh\" (UID: 
\"ce222397-3f74-43e5-85d9-39ed2aa02daf\") " pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.195295 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de902f3-47bf-470f-8d12-1d5920226652-utilities\") pod \"certified-operators-s7zdn\" (UID: \"8de902f3-47bf-470f-8d12-1d5920226652\") " pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.195413 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4jjb\" (UniqueName: \"kubernetes.io/projected/ce222397-3f74-43e5-85d9-39ed2aa02daf-kube-api-access-m4jjb\") pod \"redhat-marketplace-7vgqh\" (UID: \"ce222397-3f74-43e5-85d9-39ed2aa02daf\") " pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.195623 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce222397-3f74-43e5-85d9-39ed2aa02daf-utilities\") pod \"redhat-marketplace-7vgqh\" (UID: \"ce222397-3f74-43e5-85d9-39ed2aa02daf\") " pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.195813 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce222397-3f74-43e5-85d9-39ed2aa02daf-catalog-content\") pod \"redhat-marketplace-7vgqh\" (UID: \"ce222397-3f74-43e5-85d9-39ed2aa02daf\") " pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.213862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4jjb\" (UniqueName: \"kubernetes.io/projected/ce222397-3f74-43e5-85d9-39ed2aa02daf-kube-api-access-m4jjb\") pod \"redhat-marketplace-7vgqh\" (UID: 
\"ce222397-3f74-43e5-85d9-39ed2aa02daf\") " pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.263771 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vgqh" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.297613 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xvvh\" (UniqueName: \"kubernetes.io/projected/8de902f3-47bf-470f-8d12-1d5920226652-kube-api-access-4xvvh\") pod \"certified-operators-s7zdn\" (UID: \"8de902f3-47bf-470f-8d12-1d5920226652\") " pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.297933 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de902f3-47bf-470f-8d12-1d5920226652-catalog-content\") pod \"certified-operators-s7zdn\" (UID: \"8de902f3-47bf-470f-8d12-1d5920226652\") " pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.298193 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de902f3-47bf-470f-8d12-1d5920226652-utilities\") pod \"certified-operators-s7zdn\" (UID: \"8de902f3-47bf-470f-8d12-1d5920226652\") " pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.299053 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de902f3-47bf-470f-8d12-1d5920226652-catalog-content\") pod \"certified-operators-s7zdn\" (UID: \"8de902f3-47bf-470f-8d12-1d5920226652\") " pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.299322 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de902f3-47bf-470f-8d12-1d5920226652-utilities\") pod \"certified-operators-s7zdn\" (UID: \"8de902f3-47bf-470f-8d12-1d5920226652\") " pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.314733 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xvvh\" (UniqueName: \"kubernetes.io/projected/8de902f3-47bf-470f-8d12-1d5920226652-kube-api-access-4xvvh\") pod \"certified-operators-s7zdn\" (UID: \"8de902f3-47bf-470f-8d12-1d5920226652\") " pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.447366 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7zdn" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.466884 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vgqh"] Mar 19 15:23:55 crc kubenswrapper[4771]: W0319 15:23:55.484010 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce222397_3f74_43e5_85d9_39ed2aa02daf.slice/crio-2d971bde165cc84bb42dc2cab11d7fd82b18d3455d58b74d71b87784d896c171 WatchSource:0}: Error finding container 2d971bde165cc84bb42dc2cab11d7fd82b18d3455d58b74d71b87784d896c171: Status 404 returned error can't find the container with id 2d971bde165cc84bb42dc2cab11d7fd82b18d3455d58b74d71b87784d896c171 Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.515170 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b4c9ce-e8af-4eca-abf5-08149432aaa5" path="/var/lib/kubelet/pods/65b4c9ce-e8af-4eca-abf5-08149432aaa5/volumes" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.515922 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f49e754-62f6-4f17-a6ed-fe5e3abe32b4" 
path="/var/lib/kubelet/pods/7f49e754-62f6-4f17-a6ed-fe5e3abe32b4/volumes" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.516557 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae9495d8-bbe9-4f54-8c12-56b9f40530e1" path="/var/lib/kubelet/pods/ae9495d8-bbe9-4f54-8c12-56b9f40530e1/volumes" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.517571 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b49408ed-5087-4cb2-b70e-391c32aad069" path="/var/lib/kubelet/pods/b49408ed-5087-4cb2-b70e-391c32aad069/volumes" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.518184 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4eb7061-dde4-44f1-943a-219d2f4f5071" path="/var/lib/kubelet/pods/b4eb7061-dde4-44f1-943a-219d2f4f5071/volumes" Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.644899 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s7zdn"] Mar 19 15:23:55 crc kubenswrapper[4771]: W0319 15:23:55.652258 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8de902f3_47bf_470f_8d12_1d5920226652.slice/crio-c900c3cc62183aaa3831719f0012f9c10ec1f1685d3a45ef6baf6e0e37592b00 WatchSource:0}: Error finding container c900c3cc62183aaa3831719f0012f9c10ec1f1685d3a45ef6baf6e0e37592b00: Status 404 returned error can't find the container with id c900c3cc62183aaa3831719f0012f9c10ec1f1685d3a45ef6baf6e0e37592b00 Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.885491 4771 generic.go:334] "Generic (PLEG): container finished" podID="8de902f3-47bf-470f-8d12-1d5920226652" containerID="c0202947417460ade3a6897eebe426e443524d0c28d73f9d7edcce28c2e5033c" exitCode=0 Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.885600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7zdn" 
event={"ID":"8de902f3-47bf-470f-8d12-1d5920226652","Type":"ContainerDied","Data":"c0202947417460ade3a6897eebe426e443524d0c28d73f9d7edcce28c2e5033c"} Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.885634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7zdn" event={"ID":"8de902f3-47bf-470f-8d12-1d5920226652","Type":"ContainerStarted","Data":"c900c3cc62183aaa3831719f0012f9c10ec1f1685d3a45ef6baf6e0e37592b00"} Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.887553 4771 generic.go:334] "Generic (PLEG): container finished" podID="ce222397-3f74-43e5-85d9-39ed2aa02daf" containerID="0b54164b31af095daf09a50d3c118f0788cf6d333b164fe55ba7ad8651586406" exitCode=0 Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.887649 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vgqh" event={"ID":"ce222397-3f74-43e5-85d9-39ed2aa02daf","Type":"ContainerDied","Data":"0b54164b31af095daf09a50d3c118f0788cf6d333b164fe55ba7ad8651586406"} Mar 19 15:23:55 crc kubenswrapper[4771]: I0319 15:23:55.887710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vgqh" event={"ID":"ce222397-3f74-43e5-85d9-39ed2aa02daf","Type":"ContainerStarted","Data":"2d971bde165cc84bb42dc2cab11d7fd82b18d3455d58b74d71b87784d896c171"} Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.326588 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t9tdl"] Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.327944 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.330330 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.333364 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t9tdl"] Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.427470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm289\" (UniqueName: \"kubernetes.io/projected/31742b90-a657-49d5-be4a-7e415211fc0a-kube-api-access-nm289\") pod \"redhat-operators-t9tdl\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.427849 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-utilities\") pod \"redhat-operators-t9tdl\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.428114 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-catalog-content\") pod \"redhat-operators-t9tdl\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.527393 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gxxjr"] Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.528903 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.529398 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-utilities\") pod \"redhat-operators-t9tdl\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.529552 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-catalog-content\") pod \"redhat-operators-t9tdl\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.529657 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm289\" (UniqueName: \"kubernetes.io/projected/31742b90-a657-49d5-be4a-7e415211fc0a-kube-api-access-nm289\") pod \"redhat-operators-t9tdl\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.531114 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-utilities\") pod \"redhat-operators-t9tdl\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.531731 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-catalog-content\") pod \"redhat-operators-t9tdl\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " 
pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.532371 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.533976 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gxxjr"] Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.567607 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm289\" (UniqueName: \"kubernetes.io/projected/31742b90-a657-49d5-be4a-7e415211fc0a-kube-api-access-nm289\") pod \"redhat-operators-t9tdl\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.631538 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbefac82-3474-453a-a991-2746e6b18cd3-utilities\") pod \"community-operators-gxxjr\" (UID: \"cbefac82-3474-453a-a991-2746e6b18cd3\") " pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.631915 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbefac82-3474-453a-a991-2746e6b18cd3-catalog-content\") pod \"community-operators-gxxjr\" (UID: \"cbefac82-3474-453a-a991-2746e6b18cd3\") " pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.631957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvz84\" (UniqueName: \"kubernetes.io/projected/cbefac82-3474-453a-a991-2746e6b18cd3-kube-api-access-lvz84\") pod \"community-operators-gxxjr\" (UID: 
\"cbefac82-3474-453a-a991-2746e6b18cd3\") " pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.704304 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.734704 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbefac82-3474-453a-a991-2746e6b18cd3-utilities\") pod \"community-operators-gxxjr\" (UID: \"cbefac82-3474-453a-a991-2746e6b18cd3\") " pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.734760 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbefac82-3474-453a-a991-2746e6b18cd3-catalog-content\") pod \"community-operators-gxxjr\" (UID: \"cbefac82-3474-453a-a991-2746e6b18cd3\") " pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.734787 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvz84\" (UniqueName: \"kubernetes.io/projected/cbefac82-3474-453a-a991-2746e6b18cd3-kube-api-access-lvz84\") pod \"community-operators-gxxjr\" (UID: \"cbefac82-3474-453a-a991-2746e6b18cd3\") " pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.735510 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbefac82-3474-453a-a991-2746e6b18cd3-catalog-content\") pod \"community-operators-gxxjr\" (UID: \"cbefac82-3474-453a-a991-2746e6b18cd3\") " pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.735598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbefac82-3474-453a-a991-2746e6b18cd3-utilities\") pod \"community-operators-gxxjr\" (UID: \"cbefac82-3474-453a-a991-2746e6b18cd3\") " pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.767854 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvz84\" (UniqueName: \"kubernetes.io/projected/cbefac82-3474-453a-a991-2746e6b18cd3-kube-api-access-lvz84\") pod \"community-operators-gxxjr\" (UID: \"cbefac82-3474-453a-a991-2746e6b18cd3\") " pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.856502 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gxxjr" Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.905826 4771 generic.go:334] "Generic (PLEG): container finished" podID="8de902f3-47bf-470f-8d12-1d5920226652" containerID="f3d7e5a7c32925135fa152222a5c8bd0a262e7a42eee2360ff3b1efa76044eb6" exitCode=0 Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.906166 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7zdn" event={"ID":"8de902f3-47bf-470f-8d12-1d5920226652","Type":"ContainerDied","Data":"f3d7e5a7c32925135fa152222a5c8bd0a262e7a42eee2360ff3b1efa76044eb6"} Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.908920 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vgqh" event={"ID":"ce222397-3f74-43e5-85d9-39ed2aa02daf","Type":"ContainerStarted","Data":"f0cb790a9725d041e7f892095ba6055e90f1cc0f52d8f94414ac1390a772c988"} Mar 19 15:23:57 crc kubenswrapper[4771]: I0319 15:23:57.948942 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t9tdl"] Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.258117 4771 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gxxjr"] Mar 19 15:23:58 crc kubenswrapper[4771]: W0319 15:23:58.269313 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbefac82_3474_453a_a991_2746e6b18cd3.slice/crio-5a4908aaa0f2a4d949d6a9e9fead2696ac171bdaa8c9f3e89b15391e65562757 WatchSource:0}: Error finding container 5a4908aaa0f2a4d949d6a9e9fead2696ac171bdaa8c9f3e89b15391e65562757: Status 404 returned error can't find the container with id 5a4908aaa0f2a4d949d6a9e9fead2696ac171bdaa8c9f3e89b15391e65562757 Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.915275 4771 generic.go:334] "Generic (PLEG): container finished" podID="cbefac82-3474-453a-a991-2746e6b18cd3" containerID="2519a1262bb8671afdd8df1befc1de3440e87bae54fd8ed8fadd5ee06a4010b5" exitCode=0 Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.915372 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxxjr" event={"ID":"cbefac82-3474-453a-a991-2746e6b18cd3","Type":"ContainerDied","Data":"2519a1262bb8671afdd8df1befc1de3440e87bae54fd8ed8fadd5ee06a4010b5"} Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.915698 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxxjr" event={"ID":"cbefac82-3474-453a-a991-2746e6b18cd3","Type":"ContainerStarted","Data":"5a4908aaa0f2a4d949d6a9e9fead2696ac171bdaa8c9f3e89b15391e65562757"} Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.919861 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7zdn" event={"ID":"8de902f3-47bf-470f-8d12-1d5920226652","Type":"ContainerStarted","Data":"9a5bd6173c3779692aa09ed9c096b13162ba76b000574a346ca11ddda9276262"} Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.921415 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="31742b90-a657-49d5-be4a-7e415211fc0a" containerID="0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0" exitCode=0 Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.921441 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9tdl" event={"ID":"31742b90-a657-49d5-be4a-7e415211fc0a","Type":"ContainerDied","Data":"0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0"} Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.921511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9tdl" event={"ID":"31742b90-a657-49d5-be4a-7e415211fc0a","Type":"ContainerStarted","Data":"41114c4dfa2576676d7f7a88e94e55f6d4ea1bfeac75cbb10329cb85bc0dadde"} Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.923648 4771 generic.go:334] "Generic (PLEG): container finished" podID="ce222397-3f74-43e5-85d9-39ed2aa02daf" containerID="f0cb790a9725d041e7f892095ba6055e90f1cc0f52d8f94414ac1390a772c988" exitCode=0 Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.923675 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vgqh" event={"ID":"ce222397-3f74-43e5-85d9-39ed2aa02daf","Type":"ContainerDied","Data":"f0cb790a9725d041e7f892095ba6055e90f1cc0f52d8f94414ac1390a772c988"} Mar 19 15:23:58 crc kubenswrapper[4771]: I0319 15:23:58.977268 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s7zdn" podStartSLOduration=1.5209459079999998 podStartE2EDuration="3.977253635s" podCreationTimestamp="2026-03-19 15:23:55 +0000 UTC" firstStartedPulling="2026-03-19 15:23:55.887926332 +0000 UTC m=+495.116547544" lastFinishedPulling="2026-03-19 15:23:58.344234049 +0000 UTC m=+497.572855271" observedRunningTime="2026-03-19 15:23:58.975520681 +0000 UTC m=+498.204141883" watchObservedRunningTime="2026-03-19 15:23:58.977253635 +0000 UTC m=+498.205874837" Mar 19 
15:23:59 crc kubenswrapper[4771]: I0319 15:23:59.737720 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" podUID="e2f99f52-00ff-42f0-a2ee-122235c86b2b" containerName="registry" containerID="cri-o://2841347aafecaf651da70723292ec8c3f5c9e21c870e998848b82b50e9aa2c47" gracePeriod=30 Mar 19 15:23:59 crc kubenswrapper[4771]: I0319 15:23:59.937121 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vgqh" event={"ID":"ce222397-3f74-43e5-85d9-39ed2aa02daf","Type":"ContainerStarted","Data":"5b54cc312174cd5620614a8ee226e065ee1cc09acd72b4f5bdfd0a2c2261644e"} Mar 19 15:23:59 crc kubenswrapper[4771]: I0319 15:23:59.939051 4771 generic.go:334] "Generic (PLEG): container finished" podID="e2f99f52-00ff-42f0-a2ee-122235c86b2b" containerID="2841347aafecaf651da70723292ec8c3f5c9e21c870e998848b82b50e9aa2c47" exitCode=0 Mar 19 15:23:59 crc kubenswrapper[4771]: I0319 15:23:59.939108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" event={"ID":"e2f99f52-00ff-42f0-a2ee-122235c86b2b","Type":"ContainerDied","Data":"2841347aafecaf651da70723292ec8c3f5c9e21c870e998848b82b50e9aa2c47"} Mar 19 15:23:59 crc kubenswrapper[4771]: I0319 15:23:59.941056 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9tdl" event={"ID":"31742b90-a657-49d5-be4a-7e415211fc0a","Type":"ContainerStarted","Data":"008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f"} Mar 19 15:23:59 crc kubenswrapper[4771]: I0319 15:23:59.975215 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7vgqh" podStartSLOduration=2.528081091 podStartE2EDuration="5.975196703s" podCreationTimestamp="2026-03-19 15:23:54 +0000 UTC" firstStartedPulling="2026-03-19 15:23:55.891726699 +0000 UTC m=+495.120347901" 
lastFinishedPulling="2026-03-19 15:23:59.338842271 +0000 UTC m=+498.567463513" observedRunningTime="2026-03-19 15:23:59.956074444 +0000 UTC m=+499.184695656" watchObservedRunningTime="2026-03-19 15:23:59.975196703 +0000 UTC m=+499.203817915" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.131402 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565564-vdfl8"] Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.132353 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565564-vdfl8" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.134156 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.134375 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.134959 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.141426 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565564-vdfl8"] Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.179775 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.270221 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.270291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2f99f52-00ff-42f0-a2ee-122235c86b2b-installation-pull-secrets\") pod \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.270340 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-certificates\") pod \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.270359 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-trusted-ca\") pod \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.270376 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5f9v\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-kube-api-access-k5f9v\") pod \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.270391 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-bound-sa-token\") pod \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.270431 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-tls\") pod \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.270451 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2f99f52-00ff-42f0-a2ee-122235c86b2b-ca-trust-extracted\") pod \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\" (UID: \"e2f99f52-00ff-42f0-a2ee-122235c86b2b\") " Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.270616 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttrgs\" (UniqueName: \"kubernetes.io/projected/e575b8dd-4c4b-446c-83c7-8fde3ba656ec-kube-api-access-ttrgs\") pod \"auto-csr-approver-29565564-vdfl8\" (UID: \"e575b8dd-4c4b-446c-83c7-8fde3ba656ec\") " pod="openshift-infra/auto-csr-approver-29565564-vdfl8" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.271009 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e2f99f52-00ff-42f0-a2ee-122235c86b2b" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.271418 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e2f99f52-00ff-42f0-a2ee-122235c86b2b" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.276130 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-kube-api-access-k5f9v" (OuterVolumeSpecName: "kube-api-access-k5f9v") pod "e2f99f52-00ff-42f0-a2ee-122235c86b2b" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b"). InnerVolumeSpecName "kube-api-access-k5f9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.277250 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f99f52-00ff-42f0-a2ee-122235c86b2b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e2f99f52-00ff-42f0-a2ee-122235c86b2b" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.280601 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e2f99f52-00ff-42f0-a2ee-122235c86b2b" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:24:00 crc kubenswrapper[4771]: I0319 15:24:00.280794 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e2f99f52-00ff-42f0-a2ee-122235c86b2b" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.288582 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e2f99f52-00ff-42f0-a2ee-122235c86b2b" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.290086 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f99f52-00ff-42f0-a2ee-122235c86b2b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e2f99f52-00ff-42f0-a2ee-122235c86b2b" (UID: "e2f99f52-00ff-42f0-a2ee-122235c86b2b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.371353 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttrgs\" (UniqueName: \"kubernetes.io/projected/e575b8dd-4c4b-446c-83c7-8fde3ba656ec-kube-api-access-ttrgs\") pod \"auto-csr-approver-29565564-vdfl8\" (UID: \"e575b8dd-4c4b-446c-83c7-8fde3ba656ec\") " pod="openshift-infra/auto-csr-approver-29565564-vdfl8" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.371454 4771 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2f99f52-00ff-42f0-a2ee-122235c86b2b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.371467 4771 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2f99f52-00ff-42f0-a2ee-122235c86b2b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.371477 4771 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.371487 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2f99f52-00ff-42f0-a2ee-122235c86b2b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.371495 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5f9v\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-kube-api-access-k5f9v\") on node \"crc\" DevicePath \"\"" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.371503 4771 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.371511 4771 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2f99f52-00ff-42f0-a2ee-122235c86b2b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.388855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttrgs\" (UniqueName: \"kubernetes.io/projected/e575b8dd-4c4b-446c-83c7-8fde3ba656ec-kube-api-access-ttrgs\") pod \"auto-csr-approver-29565564-vdfl8\" (UID: \"e575b8dd-4c4b-446c-83c7-8fde3ba656ec\") " pod="openshift-infra/auto-csr-approver-29565564-vdfl8" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.477634 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565564-vdfl8" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.959429 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" event={"ID":"e2f99f52-00ff-42f0-a2ee-122235c86b2b","Type":"ContainerDied","Data":"3aad2894516c06b69764cb2abf60bea5f53f4d659ea3ef0ff9e4c39a2e3e05ec"} Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.959640 4771 scope.go:117] "RemoveContainer" containerID="2841347aafecaf651da70723292ec8c3f5c9e21c870e998848b82b50e9aa2c47" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.959469 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hcvsv" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.982240 4771 generic.go:334] "Generic (PLEG): container finished" podID="31742b90-a657-49d5-be4a-7e415211fc0a" containerID="008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f" exitCode=0 Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.982358 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9tdl" event={"ID":"31742b90-a657-49d5-be4a-7e415211fc0a","Type":"ContainerDied","Data":"008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f"} Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.986348 4771 generic.go:334] "Generic (PLEG): container finished" podID="cbefac82-3474-453a-a991-2746e6b18cd3" containerID="6da73f59d5064111097f092c45f16ffcf335bcf51263f12e26f9aebdac8ba67b" exitCode=0 Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:00.986468 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxxjr" event={"ID":"cbefac82-3474-453a-a991-2746e6b18cd3","Type":"ContainerDied","Data":"6da73f59d5064111097f092c45f16ffcf335bcf51263f12e26f9aebdac8ba67b"} Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:01.037032 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hcvsv"] Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:01.047979 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hcvsv"] Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:01.180168 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565564-vdfl8"] Mar 19 15:24:01 crc kubenswrapper[4771]: W0319 15:24:01.191712 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode575b8dd_4c4b_446c_83c7_8fde3ba656ec.slice/crio-52aba89f158ab16f2629c1912f0d7193605ca7da9bbd0485f7990ff03a2ffcf9 WatchSource:0}: Error finding container 52aba89f158ab16f2629c1912f0d7193605ca7da9bbd0485f7990ff03a2ffcf9: Status 404 returned error can't find the container with id 52aba89f158ab16f2629c1912f0d7193605ca7da9bbd0485f7990ff03a2ffcf9 Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:01.524902 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f99f52-00ff-42f0-a2ee-122235c86b2b" path="/var/lib/kubelet/pods/e2f99f52-00ff-42f0-a2ee-122235c86b2b/volumes" Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:01.996298 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9tdl" event={"ID":"31742b90-a657-49d5-be4a-7e415211fc0a","Type":"ContainerStarted","Data":"d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a"} Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:01.998890 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxxjr" event={"ID":"cbefac82-3474-453a-a991-2746e6b18cd3","Type":"ContainerStarted","Data":"f8f2894a801ecf131a3d3b1c21c70a2dcf425c813f36e12c8121487c1cf7e3e1"} Mar 19 15:24:01 crc kubenswrapper[4771]: I0319 15:24:01.999762 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565564-vdfl8" event={"ID":"e575b8dd-4c4b-446c-83c7-8fde3ba656ec","Type":"ContainerStarted","Data":"52aba89f158ab16f2629c1912f0d7193605ca7da9bbd0485f7990ff03a2ffcf9"} Mar 19 15:24:02 crc kubenswrapper[4771]: I0319 15:24:02.025946 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t9tdl" podStartSLOduration=2.456512819 podStartE2EDuration="5.025930219s" podCreationTimestamp="2026-03-19 15:23:57 +0000 UTC" firstStartedPulling="2026-03-19 15:23:58.924463185 +0000 
UTC m=+498.153084387" lastFinishedPulling="2026-03-19 15:24:01.493880575 +0000 UTC m=+500.722501787" observedRunningTime="2026-03-19 15:24:02.023170819 +0000 UTC m=+501.251792021" watchObservedRunningTime="2026-03-19 15:24:02.025930219 +0000 UTC m=+501.254551431"
Mar 19 15:24:02 crc kubenswrapper[4771]: I0319 15:24:02.051492 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gxxjr" podStartSLOduration=2.579043973 podStartE2EDuration="5.051470343s" podCreationTimestamp="2026-03-19 15:23:57 +0000 UTC" firstStartedPulling="2026-03-19 15:23:58.917165189 +0000 UTC m=+498.145786411" lastFinishedPulling="2026-03-19 15:24:01.389591579 +0000 UTC m=+500.618212781" observedRunningTime="2026-03-19 15:24:02.047348998 +0000 UTC m=+501.275970200" watchObservedRunningTime="2026-03-19 15:24:02.051470343 +0000 UTC m=+501.280091575"
Mar 19 15:24:03 crc kubenswrapper[4771]: I0319 15:24:03.006224 4771 generic.go:334] "Generic (PLEG): container finished" podID="e575b8dd-4c4b-446c-83c7-8fde3ba656ec" containerID="e2905b910893d2344f486917c4c4388c01a5395264f73c1be0ceca8922ed821c" exitCode=0
Mar 19 15:24:03 crc kubenswrapper[4771]: I0319 15:24:03.006310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565564-vdfl8" event={"ID":"e575b8dd-4c4b-446c-83c7-8fde3ba656ec","Type":"ContainerDied","Data":"e2905b910893d2344f486917c4c4388c01a5395264f73c1be0ceca8922ed821c"}
Mar 19 15:24:04 crc kubenswrapper[4771]: I0319 15:24:04.344122 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565564-vdfl8"
Mar 19 15:24:04 crc kubenswrapper[4771]: I0319 15:24:04.431267 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttrgs\" (UniqueName: \"kubernetes.io/projected/e575b8dd-4c4b-446c-83c7-8fde3ba656ec-kube-api-access-ttrgs\") pod \"e575b8dd-4c4b-446c-83c7-8fde3ba656ec\" (UID: \"e575b8dd-4c4b-446c-83c7-8fde3ba656ec\") "
Mar 19 15:24:04 crc kubenswrapper[4771]: I0319 15:24:04.438469 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e575b8dd-4c4b-446c-83c7-8fde3ba656ec-kube-api-access-ttrgs" (OuterVolumeSpecName: "kube-api-access-ttrgs") pod "e575b8dd-4c4b-446c-83c7-8fde3ba656ec" (UID: "e575b8dd-4c4b-446c-83c7-8fde3ba656ec"). InnerVolumeSpecName "kube-api-access-ttrgs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:24:04 crc kubenswrapper[4771]: I0319 15:24:04.532591 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttrgs\" (UniqueName: \"kubernetes.io/projected/e575b8dd-4c4b-446c-83c7-8fde3ba656ec-kube-api-access-ttrgs\") on node \"crc\" DevicePath \"\""
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.021776 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565564-vdfl8" event={"ID":"e575b8dd-4c4b-446c-83c7-8fde3ba656ec","Type":"ContainerDied","Data":"52aba89f158ab16f2629c1912f0d7193605ca7da9bbd0485f7990ff03a2ffcf9"}
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.021836 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52aba89f158ab16f2629c1912f0d7193605ca7da9bbd0485f7990ff03a2ffcf9"
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.021915 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565564-vdfl8"
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.265118 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7vgqh"
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.266439 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7vgqh"
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.341059 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7vgqh"
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.419965 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565558-wvlb8"]
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.426825 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565558-wvlb8"]
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.447562 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s7zdn"
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.447606 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s7zdn"
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.499093 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s7zdn"
Mar 19 15:24:05 crc kubenswrapper[4771]: I0319 15:24:05.526479 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3ce0f9-bc02-4142-8655-9751fe9197db" path="/var/lib/kubelet/pods/af3ce0f9-bc02-4142-8655-9751fe9197db/volumes"
Mar 19 15:24:06 crc kubenswrapper[4771]: I0319 15:24:06.080838 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s7zdn"
Mar 19 15:24:06 crc kubenswrapper[4771]: I0319 15:24:06.088568 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7vgqh"
Mar 19 15:24:07 crc kubenswrapper[4771]: I0319 15:24:07.705342 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t9tdl"
Mar 19 15:24:07 crc kubenswrapper[4771]: I0319 15:24:07.705859 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t9tdl"
Mar 19 15:24:07 crc kubenswrapper[4771]: I0319 15:24:07.856800 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gxxjr"
Mar 19 15:24:07 crc kubenswrapper[4771]: I0319 15:24:07.857101 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gxxjr"
Mar 19 15:24:07 crc kubenswrapper[4771]: I0319 15:24:07.891998 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gxxjr"
Mar 19 15:24:08 crc kubenswrapper[4771]: I0319 15:24:08.086647 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gxxjr"
Mar 19 15:24:08 crc kubenswrapper[4771]: I0319 15:24:08.761468 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t9tdl" podUID="31742b90-a657-49d5-be4a-7e415211fc0a" containerName="registry-server" probeResult="failure" output=<
Mar 19 15:24:08 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Mar 19 15:24:08 crc kubenswrapper[4771]: >
Mar 19 15:24:09 crc kubenswrapper[4771]: I0319 15:24:09.716677 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 15:24:17 crc kubenswrapper[4771]: I0319 15:24:17.771425 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t9tdl"
Mar 19 15:24:17 crc kubenswrapper[4771]: I0319 15:24:17.854951 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t9tdl"
Mar 19 15:24:23 crc kubenswrapper[4771]: I0319 15:24:23.027821 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:24:23 crc kubenswrapper[4771]: I0319 15:24:23.028490 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:24:23 crc kubenswrapper[4771]: I0319 15:24:23.028577 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp"
Mar 19 15:24:23 crc kubenswrapper[4771]: I0319 15:24:23.029663 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4dbfc80f1f21c45267b8baa63986792b0ac71a0dd8823637031f7df0184802e"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 15:24:23 crc kubenswrapper[4771]: I0319 15:24:23.030045 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://c4dbfc80f1f21c45267b8baa63986792b0ac71a0dd8823637031f7df0184802e" gracePeriod=600
Mar 19 15:24:24 crc kubenswrapper[4771]: I0319 15:24:24.142121 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="c4dbfc80f1f21c45267b8baa63986792b0ac71a0dd8823637031f7df0184802e" exitCode=0
Mar 19 15:24:24 crc kubenswrapper[4771]: I0319 15:24:24.142254 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"c4dbfc80f1f21c45267b8baa63986792b0ac71a0dd8823637031f7df0184802e"}
Mar 19 15:24:24 crc kubenswrapper[4771]: I0319 15:24:24.143332 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"41fe4b028ed4c1241b67194aaa2a009f141466dd206828b233686513e2dbdf58"}
Mar 19 15:24:24 crc kubenswrapper[4771]: I0319 15:24:24.143371 4771 scope.go:117] "RemoveContainer" containerID="505679622b5d316cee380ad3b151c460658f70872e83c1d6089d7173618c3e93"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.135046 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565566-cpkp9"]
Mar 19 15:26:00 crc kubenswrapper[4771]: E0319 15:26:00.136927 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f99f52-00ff-42f0-a2ee-122235c86b2b" containerName="registry"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.137790 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f99f52-00ff-42f0-a2ee-122235c86b2b" containerName="registry"
Mar 19 15:26:00 crc kubenswrapper[4771]: E0319 15:26:00.137907 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e575b8dd-4c4b-446c-83c7-8fde3ba656ec" containerName="oc"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.137977 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e575b8dd-4c4b-446c-83c7-8fde3ba656ec" containerName="oc"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.138374 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e575b8dd-4c4b-446c-83c7-8fde3ba656ec" containerName="oc"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.138519 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f99f52-00ff-42f0-a2ee-122235c86b2b" containerName="registry"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.139401 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565566-cpkp9"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.142713 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.143058 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.142937 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565566-cpkp9"]
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.142829 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.198202 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj8vg\" (UniqueName: \"kubernetes.io/projected/51ce2534-ab98-47b1-8311-eb4f4d13e0dc-kube-api-access-vj8vg\") pod \"auto-csr-approver-29565566-cpkp9\" (UID: \"51ce2534-ab98-47b1-8311-eb4f4d13e0dc\") " pod="openshift-infra/auto-csr-approver-29565566-cpkp9"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.300902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj8vg\" (UniqueName: \"kubernetes.io/projected/51ce2534-ab98-47b1-8311-eb4f4d13e0dc-kube-api-access-vj8vg\") pod \"auto-csr-approver-29565566-cpkp9\" (UID: \"51ce2534-ab98-47b1-8311-eb4f4d13e0dc\") " pod="openshift-infra/auto-csr-approver-29565566-cpkp9"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.337477 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj8vg\" (UniqueName: \"kubernetes.io/projected/51ce2534-ab98-47b1-8311-eb4f4d13e0dc-kube-api-access-vj8vg\") pod \"auto-csr-approver-29565566-cpkp9\" (UID: \"51ce2534-ab98-47b1-8311-eb4f4d13e0dc\") " pod="openshift-infra/auto-csr-approver-29565566-cpkp9"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.470570 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565566-cpkp9"
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.705776 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565566-cpkp9"]
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.713065 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 15:26:00 crc kubenswrapper[4771]: I0319 15:26:00.803870 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565566-cpkp9" event={"ID":"51ce2534-ab98-47b1-8311-eb4f4d13e0dc","Type":"ContainerStarted","Data":"749894389be05e6298125ec2a4fc92ae8426662ce24d5010e08d4fc40fb2225e"}
Mar 19 15:26:02 crc kubenswrapper[4771]: I0319 15:26:02.817229 4771 generic.go:334] "Generic (PLEG): container finished" podID="51ce2534-ab98-47b1-8311-eb4f4d13e0dc" containerID="ee2c415f18a1be2bc58bdb26c16bff0696ade8ab40ab2f887a13888f13682768" exitCode=0
Mar 19 15:26:02 crc kubenswrapper[4771]: I0319 15:26:02.817318 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565566-cpkp9" event={"ID":"51ce2534-ab98-47b1-8311-eb4f4d13e0dc","Type":"ContainerDied","Data":"ee2c415f18a1be2bc58bdb26c16bff0696ade8ab40ab2f887a13888f13682768"}
Mar 19 15:26:04 crc kubenswrapper[4771]: I0319 15:26:04.106754 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565566-cpkp9"
Mar 19 15:26:04 crc kubenswrapper[4771]: I0319 15:26:04.152395 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj8vg\" (UniqueName: \"kubernetes.io/projected/51ce2534-ab98-47b1-8311-eb4f4d13e0dc-kube-api-access-vj8vg\") pod \"51ce2534-ab98-47b1-8311-eb4f4d13e0dc\" (UID: \"51ce2534-ab98-47b1-8311-eb4f4d13e0dc\") "
Mar 19 15:26:04 crc kubenswrapper[4771]: I0319 15:26:04.159089 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ce2534-ab98-47b1-8311-eb4f4d13e0dc-kube-api-access-vj8vg" (OuterVolumeSpecName: "kube-api-access-vj8vg") pod "51ce2534-ab98-47b1-8311-eb4f4d13e0dc" (UID: "51ce2534-ab98-47b1-8311-eb4f4d13e0dc"). InnerVolumeSpecName "kube-api-access-vj8vg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:26:04 crc kubenswrapper[4771]: I0319 15:26:04.253343 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj8vg\" (UniqueName: \"kubernetes.io/projected/51ce2534-ab98-47b1-8311-eb4f4d13e0dc-kube-api-access-vj8vg\") on node \"crc\" DevicePath \"\""
Mar 19 15:26:04 crc kubenswrapper[4771]: I0319 15:26:04.831803 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565566-cpkp9" event={"ID":"51ce2534-ab98-47b1-8311-eb4f4d13e0dc","Type":"ContainerDied","Data":"749894389be05e6298125ec2a4fc92ae8426662ce24d5010e08d4fc40fb2225e"}
Mar 19 15:26:04 crc kubenswrapper[4771]: I0319 15:26:04.831864 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="749894389be05e6298125ec2a4fc92ae8426662ce24d5010e08d4fc40fb2225e"
Mar 19 15:26:04 crc kubenswrapper[4771]: I0319 15:26:04.831891 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565566-cpkp9"
Mar 19 15:26:05 crc kubenswrapper[4771]: I0319 15:26:05.181433 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565560-5rgwr"]
Mar 19 15:26:05 crc kubenswrapper[4771]: I0319 15:26:05.187858 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565560-5rgwr"]
Mar 19 15:26:05 crc kubenswrapper[4771]: I0319 15:26:05.521737 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c681a9f8-ad65-46af-b5a2-3ea110cda37f" path="/var/lib/kubelet/pods/c681a9f8-ad65-46af-b5a2-3ea110cda37f/volumes"
Mar 19 15:26:23 crc kubenswrapper[4771]: I0319 15:26:23.027707 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:26:23 crc kubenswrapper[4771]: I0319 15:26:23.028324 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:26:41 crc kubenswrapper[4771]: I0319 15:26:41.977330 4771 scope.go:117] "RemoveContainer" containerID="8c83bee82a9df53269fc854fe79a2c739db69bbf5a840b57b873b8a056b01e5a"
Mar 19 15:26:42 crc kubenswrapper[4771]: I0319 15:26:42.008593 4771 scope.go:117] "RemoveContainer" containerID="eefa995bf453314bfdaac302525fa3aafee26a1072abb2daf857895c98c1e5d6"
Mar 19 15:26:53 crc kubenswrapper[4771]: I0319 15:26:53.027378 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:26:53 crc kubenswrapper[4771]: I0319 15:26:53.027980 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:27:23 crc kubenswrapper[4771]: I0319 15:27:23.027941 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:27:23 crc kubenswrapper[4771]: I0319 15:27:23.028843 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:27:23 crc kubenswrapper[4771]: I0319 15:27:23.028927 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp"
Mar 19 15:27:23 crc kubenswrapper[4771]: I0319 15:27:23.029886 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41fe4b028ed4c1241b67194aaa2a009f141466dd206828b233686513e2dbdf58"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 15:27:23 crc kubenswrapper[4771]: I0319 15:27:23.030045 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://41fe4b028ed4c1241b67194aaa2a009f141466dd206828b233686513e2dbdf58" gracePeriod=600
Mar 19 15:27:23 crc kubenswrapper[4771]: I0319 15:27:23.339630 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="41fe4b028ed4c1241b67194aaa2a009f141466dd206828b233686513e2dbdf58" exitCode=0
Mar 19 15:27:23 crc kubenswrapper[4771]: I0319 15:27:23.339709 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"41fe4b028ed4c1241b67194aaa2a009f141466dd206828b233686513e2dbdf58"}
Mar 19 15:27:23 crc kubenswrapper[4771]: I0319 15:27:23.340048 4771 scope.go:117] "RemoveContainer" containerID="c4dbfc80f1f21c45267b8baa63986792b0ac71a0dd8823637031f7df0184802e"
Mar 19 15:27:24 crc kubenswrapper[4771]: I0319 15:27:24.351345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"b1c74e779cae295d8261f06e3e0b3804798c6e48c30f016b0eaeacf144fab8c8"}
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.148875 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565568-fwdhr"]
Mar 19 15:28:00 crc kubenswrapper[4771]: E0319 15:28:00.150257 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ce2534-ab98-47b1-8311-eb4f4d13e0dc" containerName="oc"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.150295 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ce2534-ab98-47b1-8311-eb4f4d13e0dc" containerName="oc"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.150565 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ce2534-ab98-47b1-8311-eb4f4d13e0dc" containerName="oc"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.152833 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565568-fwdhr"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.156756 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.157195 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.157193 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.157738 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565568-fwdhr"]
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.227348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq76g\" (UniqueName: \"kubernetes.io/projected/68217883-a964-4b84-880a-6ac714e5e58e-kube-api-access-vq76g\") pod \"auto-csr-approver-29565568-fwdhr\" (UID: \"68217883-a964-4b84-880a-6ac714e5e58e\") " pod="openshift-infra/auto-csr-approver-29565568-fwdhr"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.328901 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq76g\" (UniqueName: \"kubernetes.io/projected/68217883-a964-4b84-880a-6ac714e5e58e-kube-api-access-vq76g\") pod \"auto-csr-approver-29565568-fwdhr\" (UID: \"68217883-a964-4b84-880a-6ac714e5e58e\") " pod="openshift-infra/auto-csr-approver-29565568-fwdhr"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.368490 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq76g\" (UniqueName: \"kubernetes.io/projected/68217883-a964-4b84-880a-6ac714e5e58e-kube-api-access-vq76g\") pod \"auto-csr-approver-29565568-fwdhr\" (UID: \"68217883-a964-4b84-880a-6ac714e5e58e\") " pod="openshift-infra/auto-csr-approver-29565568-fwdhr"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.481459 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565568-fwdhr"
Mar 19 15:28:00 crc kubenswrapper[4771]: I0319 15:28:00.889537 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565568-fwdhr"]
Mar 19 15:28:01 crc kubenswrapper[4771]: I0319 15:28:01.636314 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565568-fwdhr" event={"ID":"68217883-a964-4b84-880a-6ac714e5e58e","Type":"ContainerStarted","Data":"d194bb241900cefc91ee9bf1f2a94ce40685b0ab14cc7dd82f590d1bd763cbbb"}
Mar 19 15:28:02 crc kubenswrapper[4771]: I0319 15:28:02.646857 4771 generic.go:334] "Generic (PLEG): container finished" podID="68217883-a964-4b84-880a-6ac714e5e58e" containerID="3e4269d0a352b4a88f7c4b744aa306758accf3c08d237ea617934a623dc02536" exitCode=0
Mar 19 15:28:02 crc kubenswrapper[4771]: I0319 15:28:02.647025 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565568-fwdhr" event={"ID":"68217883-a964-4b84-880a-6ac714e5e58e","Type":"ContainerDied","Data":"3e4269d0a352b4a88f7c4b744aa306758accf3c08d237ea617934a623dc02536"}
Mar 19 15:28:03 crc kubenswrapper[4771]: I0319 15:28:03.980197 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565568-fwdhr"
Mar 19 15:28:04 crc kubenswrapper[4771]: I0319 15:28:04.084845 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq76g\" (UniqueName: \"kubernetes.io/projected/68217883-a964-4b84-880a-6ac714e5e58e-kube-api-access-vq76g\") pod \"68217883-a964-4b84-880a-6ac714e5e58e\" (UID: \"68217883-a964-4b84-880a-6ac714e5e58e\") "
Mar 19 15:28:04 crc kubenswrapper[4771]: I0319 15:28:04.091100 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68217883-a964-4b84-880a-6ac714e5e58e-kube-api-access-vq76g" (OuterVolumeSpecName: "kube-api-access-vq76g") pod "68217883-a964-4b84-880a-6ac714e5e58e" (UID: "68217883-a964-4b84-880a-6ac714e5e58e"). InnerVolumeSpecName "kube-api-access-vq76g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:28:04 crc kubenswrapper[4771]: I0319 15:28:04.186359 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq76g\" (UniqueName: \"kubernetes.io/projected/68217883-a964-4b84-880a-6ac714e5e58e-kube-api-access-vq76g\") on node \"crc\" DevicePath \"\""
Mar 19 15:28:04 crc kubenswrapper[4771]: I0319 15:28:04.673762 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565568-fwdhr" event={"ID":"68217883-a964-4b84-880a-6ac714e5e58e","Type":"ContainerDied","Data":"d194bb241900cefc91ee9bf1f2a94ce40685b0ab14cc7dd82f590d1bd763cbbb"}
Mar 19 15:28:04 crc kubenswrapper[4771]: I0319 15:28:04.673819 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d194bb241900cefc91ee9bf1f2a94ce40685b0ab14cc7dd82f590d1bd763cbbb"
Mar 19 15:28:04 crc kubenswrapper[4771]: I0319 15:28:04.673911 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565568-fwdhr"
Mar 19 15:28:05 crc kubenswrapper[4771]: I0319 15:28:05.067643 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565562-68tfj"]
Mar 19 15:28:05 crc kubenswrapper[4771]: I0319 15:28:05.074320 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565562-68tfj"]
Mar 19 15:28:05 crc kubenswrapper[4771]: I0319 15:28:05.521165 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb11256-816a-4907-9746-56e259e4fd29" path="/var/lib/kubelet/pods/cdb11256-816a-4907-9746-56e259e4fd29/volumes"
Mar 19 15:28:42 crc kubenswrapper[4771]: I0319 15:28:42.077621 4771 scope.go:117] "RemoveContainer" containerID="dcc822adc2395f9f73e4d9667375ba7d2e57039a8088293b565abf4cf2474e0b"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.184635 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-74jv6"]
Mar 19 15:29:20 crc kubenswrapper[4771]: E0319 15:29:20.185786 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68217883-a964-4b84-880a-6ac714e5e58e" containerName="oc"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.185818 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="68217883-a964-4b84-880a-6ac714e5e58e" containerName="oc"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.190310 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="68217883-a964-4b84-880a-6ac714e5e58e" containerName="oc"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.190893 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-74jv6"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.193764 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.194344 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-22gkf"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.194762 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-9wbzz"]
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.198855 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.202903 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-74jv6"]
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.203073 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-9wbzz"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.208823 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xcpkw"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.213540 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p56ts"]
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.214319 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.216659 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-p6r5v"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.230278 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-9wbzz"]
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.234312 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p56ts"]
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.298413 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6v27\" (UniqueName: \"kubernetes.io/projected/d528bd2f-48f9-4d23-b858-99febe63243c-kube-api-access-p6v27\") pod \"cert-manager-webhook-687f57d79b-p56ts\" (UID: \"d528bd2f-48f9-4d23-b858-99febe63243c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.298611 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2vzs\" (UniqueName: \"kubernetes.io/projected/0e55dcd2-7a64-4176-a47b-4c6ce9b9f663-kube-api-access-x2vzs\") pod \"cert-manager-858654f9db-9wbzz\" (UID: \"0e55dcd2-7a64-4176-a47b-4c6ce9b9f663\") " pod="cert-manager/cert-manager-858654f9db-9wbzz"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.298693 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8hwb\" (UniqueName: \"kubernetes.io/projected/e6f2cd45-d12c-4239-a907-c3481ed379d1-kube-api-access-t8hwb\") pod \"cert-manager-cainjector-cf98fcc89-74jv6\" (UID: \"e6f2cd45-d12c-4239-a907-c3481ed379d1\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-74jv6"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.400570 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8hwb\" (UniqueName: \"kubernetes.io/projected/e6f2cd45-d12c-4239-a907-c3481ed379d1-kube-api-access-t8hwb\") pod \"cert-manager-cainjector-cf98fcc89-74jv6\" (UID: \"e6f2cd45-d12c-4239-a907-c3481ed379d1\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-74jv6"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.400721 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6v27\" (UniqueName: \"kubernetes.io/projected/d528bd2f-48f9-4d23-b858-99febe63243c-kube-api-access-p6v27\") pod \"cert-manager-webhook-687f57d79b-p56ts\" (UID: \"d528bd2f-48f9-4d23-b858-99febe63243c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.400800 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2vzs\" (UniqueName: \"kubernetes.io/projected/0e55dcd2-7a64-4176-a47b-4c6ce9b9f663-kube-api-access-x2vzs\") pod \"cert-manager-858654f9db-9wbzz\" (UID: \"0e55dcd2-7a64-4176-a47b-4c6ce9b9f663\") " pod="cert-manager/cert-manager-858654f9db-9wbzz"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.421740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2vzs\" (UniqueName: \"kubernetes.io/projected/0e55dcd2-7a64-4176-a47b-4c6ce9b9f663-kube-api-access-x2vzs\") pod \"cert-manager-858654f9db-9wbzz\" (UID: \"0e55dcd2-7a64-4176-a47b-4c6ce9b9f663\") " pod="cert-manager/cert-manager-858654f9db-9wbzz"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.424789 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6v27\" (UniqueName: \"kubernetes.io/projected/d528bd2f-48f9-4d23-b858-99febe63243c-kube-api-access-p6v27\") pod \"cert-manager-webhook-687f57d79b-p56ts\" (UID: \"d528bd2f-48f9-4d23-b858-99febe63243c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.427641 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8hwb\" (UniqueName: \"kubernetes.io/projected/e6f2cd45-d12c-4239-a907-c3481ed379d1-kube-api-access-t8hwb\") pod \"cert-manager-cainjector-cf98fcc89-74jv6\" (UID: \"e6f2cd45-d12c-4239-a907-c3481ed379d1\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-74jv6"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.509344 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-74jv6"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.516294 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-9wbzz"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.527152 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts"
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.797179 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p56ts"]
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.936844 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-9wbzz"]
Mar 19 15:29:20 crc kubenswrapper[4771]: W0319 15:29:20.941690 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e55dcd2_7a64_4176_a47b_4c6ce9b9f663.slice/crio-52ba31eb5c1bcc7d469344cab8840a9c1b1ed05acd9c942d7710f2685fb8f957 WatchSource:0}: Error finding container 52ba31eb5c1bcc7d469344cab8840a9c1b1ed05acd9c942d7710f2685fb8f957: Status 404 returned error can't find the container with id 52ba31eb5c1bcc7d469344cab8840a9c1b1ed05acd9c942d7710f2685fb8f957
Mar 19 15:29:20 crc kubenswrapper[4771]: I0319 15:29:20.950039 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-74jv6"]
Mar 19 15:29:20 crc kubenswrapper[4771]: W0319 15:29:20.952473 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6f2cd45_d12c_4239_a907_c3481ed379d1.slice/crio-18e8397ebb5aa0ad33a0d9a45e73e5dbc08e0e2a163989af681eccc125a93dc6 WatchSource:0}: Error finding container 18e8397ebb5aa0ad33a0d9a45e73e5dbc08e0e2a163989af681eccc125a93dc6: Status 404 returned error can't find the container with id 18e8397ebb5aa0ad33a0d9a45e73e5dbc08e0e2a163989af681eccc125a93dc6
Mar 19 15:29:21 crc kubenswrapper[4771]: I0319 15:29:21.219794 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts" event={"ID":"d528bd2f-48f9-4d23-b858-99febe63243c","Type":"ContainerStarted","Data":"9319bbef9a66f336b950e002f027fb0cce067bcdd48c329e7a8b3843ed03742c"}
Mar 19 15:29:21 crc kubenswrapper[4771]: I0319 15:29:21.222281 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-74jv6" event={"ID":"e6f2cd45-d12c-4239-a907-c3481ed379d1","Type":"ContainerStarted","Data":"18e8397ebb5aa0ad33a0d9a45e73e5dbc08e0e2a163989af681eccc125a93dc6"}
Mar 19 15:29:21 crc kubenswrapper[4771]: I0319 15:29:21.223466 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-9wbzz" event={"ID":"0e55dcd2-7a64-4176-a47b-4c6ce9b9f663","Type":"ContainerStarted","Data":"52ba31eb5c1bcc7d469344cab8840a9c1b1ed05acd9c942d7710f2685fb8f957"}
Mar 19 15:29:23 crc kubenswrapper[4771]: I0319 15:29:23.027426 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
start-of-body= Mar 19 15:29:23 crc kubenswrapper[4771]: I0319 15:29:23.027777 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:29:24 crc kubenswrapper[4771]: I0319 15:29:24.242330 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-9wbzz" event={"ID":"0e55dcd2-7a64-4176-a47b-4c6ce9b9f663","Type":"ContainerStarted","Data":"40e3de6825224ca89f8002d3ab4610b466c7d8be51a6b8c9b89882ed45a4c943"} Mar 19 15:29:24 crc kubenswrapper[4771]: I0319 15:29:24.246404 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts" event={"ID":"d528bd2f-48f9-4d23-b858-99febe63243c","Type":"ContainerStarted","Data":"f61b52a3aa654b83035a439d5a86b561160ba5dfd242ef64fc1f8875fb8675ba"} Mar 19 15:29:24 crc kubenswrapper[4771]: I0319 15:29:24.246748 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts" Mar 19 15:29:24 crc kubenswrapper[4771]: I0319 15:29:24.268651 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-9wbzz" podStartSLOduration=1.331642255 podStartE2EDuration="4.268630487s" podCreationTimestamp="2026-03-19 15:29:20 +0000 UTC" firstStartedPulling="2026-03-19 15:29:20.943820535 +0000 UTC m=+820.172441737" lastFinishedPulling="2026-03-19 15:29:23.880808767 +0000 UTC m=+823.109429969" observedRunningTime="2026-03-19 15:29:24.265644952 +0000 UTC m=+823.494266164" watchObservedRunningTime="2026-03-19 15:29:24.268630487 +0000 UTC m=+823.497251689" Mar 19 15:29:24 crc kubenswrapper[4771]: I0319 15:29:24.288198 4771 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts" podStartSLOduration=1.157499444 podStartE2EDuration="4.288172426s" podCreationTimestamp="2026-03-19 15:29:20 +0000 UTC" firstStartedPulling="2026-03-19 15:29:20.807953243 +0000 UTC m=+820.036574445" lastFinishedPulling="2026-03-19 15:29:23.938626225 +0000 UTC m=+823.167247427" observedRunningTime="2026-03-19 15:29:24.283755566 +0000 UTC m=+823.512376808" watchObservedRunningTime="2026-03-19 15:29:24.288172426 +0000 UTC m=+823.516793638" Mar 19 15:29:28 crc kubenswrapper[4771]: I0319 15:29:28.282547 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-74jv6" event={"ID":"e6f2cd45-d12c-4239-a907-c3481ed379d1","Type":"ContainerStarted","Data":"38417d0e7401fac5ac5f2198a4e765138db8f12289f3c6dd0c6b78cd45c4d004"} Mar 19 15:29:28 crc kubenswrapper[4771]: I0319 15:29:28.302782 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-74jv6" podStartSLOduration=2.045662281 podStartE2EDuration="8.302754699s" podCreationTimestamp="2026-03-19 15:29:20 +0000 UTC" firstStartedPulling="2026-03-19 15:29:20.95444564 +0000 UTC m=+820.183066842" lastFinishedPulling="2026-03-19 15:29:27.211538058 +0000 UTC m=+826.440159260" observedRunningTime="2026-03-19 15:29:28.301895087 +0000 UTC m=+827.530516319" watchObservedRunningTime="2026-03-19 15:29:28.302754699 +0000 UTC m=+827.531375961" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.329060 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b6zx4"] Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.331181 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovn-controller" containerID="cri-o://90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2" 
gracePeriod=30 Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.331267 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="nbdb" containerID="cri-o://113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000" gracePeriod=30 Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.331296 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702" gracePeriod=30 Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.331353 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kube-rbac-proxy-node" containerID="cri-o://26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6" gracePeriod=30 Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.331424 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="northd" containerID="cri-o://1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727" gracePeriod=30 Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.331438 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovn-acl-logging" containerID="cri-o://3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c" gracePeriod=30 Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.331768 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="sbdb" containerID="cri-o://8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1" gracePeriod=30 Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.385652 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" containerID="cri-o://bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d" gracePeriod=30 Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.529781 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-p56ts" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.680747 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/3.log" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.683589 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovn-acl-logging/0.log" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.684111 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovn-controller/0.log" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.684543 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.741827 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bw7d2"] Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742132 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="sbdb" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742173 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="sbdb" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742187 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742194 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742203 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742211 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742222 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="nbdb" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742229 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="nbdb" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742262 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" 
containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742270 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742279 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovn-acl-logging" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742289 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovn-acl-logging" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742305 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovn-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742337 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovn-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742351 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kube-rbac-proxy-node" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742359 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kube-rbac-proxy-node" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742367 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742374 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742390 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="northd" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742423 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="northd" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742434 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kubecfg-setup" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742441 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kubecfg-setup" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742597 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="nbdb" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742610 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovn-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742619 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742627 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742635 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742667 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742678 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="kube-rbac-proxy-node" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742693 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="sbdb" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742702 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovn-acl-logging" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742711 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="northd" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.742862 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.742873 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.743061 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.743078 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: E0319 15:29:30.743252 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.743263 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" containerName="ovnkube-controller" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.745396 4771 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750200 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-systemd-units\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750249 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk5n5\" (UniqueName: \"kubernetes.io/projected/bf31981b-d437-4216-a275-5b566d8c49aa-kube-api-access-hk5n5\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750271 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750303 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-config\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750329 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-bin\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750354 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-openvswitch\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750374 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-env-overrides\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750372 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750407 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf31981b-d437-4216-a275-5b566d8c49aa-ovn-node-metrics-cert\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750410 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750426 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750427 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-node-log\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750459 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-node-log" (OuterVolumeSpecName: "node-log") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750493 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750506 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-log-socket\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750536 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-systemd\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750558 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-etc-openvswitch\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750580 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-netns\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-kubelet\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750644 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-ovn\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750676 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-netd\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750708 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-ovn-kubernetes\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750727 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-var-lib-openvswitch\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750762 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-script-lib\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.750791 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-slash\") pod \"bf31981b-d437-4216-a275-5b566d8c49aa\" (UID: \"bf31981b-d437-4216-a275-5b566d8c49aa\") " Mar 19 15:29:30 crc 
kubenswrapper[4771]: I0319 15:29:30.750809 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751123 4771 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-node-log\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751138 4771 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751152 4771 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751163 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751174 4771 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751185 4771 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751216 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-slash" (OuterVolumeSpecName: "host-slash") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751242 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751264 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751286 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751307 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751328 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751348 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751369 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751437 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-log-socket" (OuterVolumeSpecName: "log-socket") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.751798 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.752730 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.758132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf31981b-d437-4216-a275-5b566d8c49aa-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.758519 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf31981b-d437-4216-a275-5b566d8c49aa-kube-api-access-hk5n5" (OuterVolumeSpecName: "kube-api-access-hk5n5") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "kube-api-access-hk5n5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.784639 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "bf31981b-d437-4216-a275-5b566d8c49aa" (UID: "bf31981b-d437-4216-a275-5b566d8c49aa"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.852148 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz27n\" (UniqueName: \"kubernetes.io/projected/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-kube-api-access-cz27n\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.852389 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-systemd-units\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.852477 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-slash\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.852556 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-run-netns\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.852635 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-log-socket\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.852710 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-var-lib-openvswitch\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.852783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-ovnkube-script-lib\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.852853 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-cni-bin\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.852927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-ovnkube-config\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853029 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-node-log\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853110 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-ovn-node-metrics-cert\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853195 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-cni-netd\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853269 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-etc-openvswitch\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853344 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-run-ovn\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853425 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-kubelet\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853506 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-run-systemd\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853652 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-run-ovn-kubernetes\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853730 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-env-overrides\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853804 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-run-openvswitch\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853894 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.853956 4771 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-slash\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854036 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk5n5\" (UniqueName: \"kubernetes.io/projected/bf31981b-d437-4216-a275-5b566d8c49aa-kube-api-access-hk5n5\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854106 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf31981b-d437-4216-a275-5b566d8c49aa-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854162 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf31981b-d437-4216-a275-5b566d8c49aa-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854213 4771 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-log-socket\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854268 4771 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854323 4771 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854373 4771 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854425 4771 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854479 4771 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854532 4771 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854585 4771 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.854634 4771 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf31981b-d437-4216-a275-5b566d8c49aa-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956235 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-run-systemd\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956312 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-run-systemd\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956387 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-run-ovn-kubernetes\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956454 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-run-ovn-kubernetes\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956456 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-env-overrides\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956535 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-run-openvswitch\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956481 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956581 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz27n\" (UniqueName: \"kubernetes.io/projected/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-kube-api-access-cz27n\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956615 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-systemd-units\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956641 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-run-openvswitch\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-slash\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956692 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-slash\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956728 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-run-netns\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956768 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-log-socket\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956810 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-var-lib-openvswitch\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956811 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-systemd-units\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956883 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-var-lib-openvswitch\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956846 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-run-netns\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956842 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-ovnkube-script-lib\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956918 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-log-socket\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.956963 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-cni-bin\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957091 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-ovnkube-config\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957103 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-env-overrides\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957163 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-cni-bin\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957177 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-node-log\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-ovn-node-metrics-cert\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-node-log\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957399 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-cni-netd\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957466 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-etc-openvswitch\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957569 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-cni-netd\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957634 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-run-ovn\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957672 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-etc-openvswitch\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957719 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-ovnkube-config\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957754 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-kubelet\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957761 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-run-ovn\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.957840 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-host-kubelet\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.958597 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-ovnkube-script-lib\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.968194 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-ovn-node-metrics-cert\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:30 crc kubenswrapper[4771]: I0319 15:29:30.973582 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz27n\" (UniqueName: \"kubernetes.io/projected/bd2e5ea2-2f31-4431-812f-91e2c066ffdf-kube-api-access-cz27n\") pod \"ovnkube-node-bw7d2\" (UID: \"bd2e5ea2-2f31-4431-812f-91e2c066ffdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.061655 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:29:31 crc kubenswrapper[4771]: W0319 15:29:31.097521 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd2e5ea2_2f31_4431_812f_91e2c066ffdf.slice/crio-c5215528ba9bddf6c7b9b5beaa07b329989a5ef9746af63691e82dc50a809410 WatchSource:0}: Error finding container c5215528ba9bddf6c7b9b5beaa07b329989a5ef9746af63691e82dc50a809410: Status 404 returned error can't find the container with id c5215528ba9bddf6c7b9b5beaa07b329989a5ef9746af63691e82dc50a809410
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.319369 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9989m_51f8c2de-454d-4b7c-bf30-2f5d12d7088e/kube-multus/2.log"
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.320215 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9989m_51f8c2de-454d-4b7c-bf30-2f5d12d7088e/kube-multus/1.log"
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.320336 4771 generic.go:334] "Generic (PLEG): container finished" podID="51f8c2de-454d-4b7c-bf30-2f5d12d7088e" containerID="3bd5d8865766ecf282dd1c1331a00385cc5afafe9aeec835b90a86519234b1ab" exitCode=2
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.320598 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9989m" event={"ID":"51f8c2de-454d-4b7c-bf30-2f5d12d7088e","Type":"ContainerDied","Data":"3bd5d8865766ecf282dd1c1331a00385cc5afafe9aeec835b90a86519234b1ab"}
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.320702 4771 scope.go:117] "RemoveContainer" containerID="65f7ff3b147b68b53a4ab6e3fc6c7b1b5f1c61d11dc5bfab7b3d92a638fecbb2"
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.322363 4771 scope.go:117] "RemoveContainer" containerID="3bd5d8865766ecf282dd1c1331a00385cc5afafe9aeec835b90a86519234b1ab"
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.326856 4771 generic.go:334] "Generic (PLEG): container finished" podID="bd2e5ea2-2f31-4431-812f-91e2c066ffdf" containerID="34829d6fc4cab6ddfd0fb89bc1ef244319d61aecc0c71b57684336f9849b36c4" exitCode=0
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.326930 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerDied","Data":"34829d6fc4cab6ddfd0fb89bc1ef244319d61aecc0c71b57684336f9849b36c4"}
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.327025 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerStarted","Data":"c5215528ba9bddf6c7b9b5beaa07b329989a5ef9746af63691e82dc50a809410"}
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.335695 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovnkube-controller/3.log"
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.339507 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovn-acl-logging/0.log"
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340042 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6zx4_bf31981b-d437-4216-a275-5b566d8c49aa/ovn-controller/0.log"
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340347 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d" exitCode=0
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340370 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1" exitCode=0
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340378 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000" exitCode=0
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340387 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727" exitCode=0
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340397 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702" exitCode=0
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340405 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6" exitCode=0
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340412 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c" exitCode=143
Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340419 4771
generic.go:334] "Generic (PLEG): container finished" podID="bf31981b-d437-4216-a275-5b566d8c49aa" containerID="90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2" exitCode=143 Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340440 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340466 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340486 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340496 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340505 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" 
event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340516 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340525 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340531 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340536 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340541 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340547 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340553 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340559 4771 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340564 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340570 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340577 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340584 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340590 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340595 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340600 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} Mar 19 15:29:31 crc kubenswrapper[4771]: 
I0319 15:29:31.340605 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340610 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340616 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340621 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340625 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340630 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340637 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340645 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340653 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340660 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340667 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340708 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340716 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340723 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340750 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340835 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340846 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340857 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" event={"ID":"bf31981b-d437-4216-a275-5b566d8c49aa","Type":"ContainerDied","Data":"69e97e6616d71e73668c2c7097c1536469c45c0d2233f02aa729ec54cc483386"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340871 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340879 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340885 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340890 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340896 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340902 4771 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340908 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340914 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340919 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.340925 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd"} Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.341044 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b6zx4" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.373794 4771 scope.go:117] "RemoveContainer" containerID="bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.407094 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b6zx4"] Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.410830 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b6zx4"] Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.412840 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.442030 4771 scope.go:117] "RemoveContainer" containerID="8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.457700 4771 scope.go:117] "RemoveContainer" containerID="113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.507810 4771 scope.go:117] "RemoveContainer" containerID="1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.521261 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf31981b-d437-4216-a275-5b566d8c49aa" path="/var/lib/kubelet/pods/bf31981b-d437-4216-a275-5b566d8c49aa/volumes" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.535265 4771 scope.go:117] "RemoveContainer" containerID="34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.572111 4771 scope.go:117] "RemoveContainer" containerID="26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.592154 4771 scope.go:117] 
"RemoveContainer" containerID="3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.605250 4771 scope.go:117] "RemoveContainer" containerID="90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.623618 4771 scope.go:117] "RemoveContainer" containerID="6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.648575 4771 scope.go:117] "RemoveContainer" containerID="bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.649279 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d\": container with ID starting with bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d not found: ID does not exist" containerID="bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.649321 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} err="failed to get container status \"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d\": rpc error: code = NotFound desc = could not find container \"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d\": container with ID starting with bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.649346 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.649789 4771 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\": container with ID starting with caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b not found: ID does not exist" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.649873 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} err="failed to get container status \"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\": rpc error: code = NotFound desc = could not find container \"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\": container with ID starting with caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.649946 4771 scope.go:117] "RemoveContainer" containerID="8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.650328 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\": container with ID starting with 8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1 not found: ID does not exist" containerID="8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.650356 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} err="failed to get container status \"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\": rpc error: code = NotFound desc = could not find container 
\"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\": container with ID starting with 8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.650373 4771 scope.go:117] "RemoveContainer" containerID="113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.650668 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\": container with ID starting with 113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000 not found: ID does not exist" containerID="113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.650711 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} err="failed to get container status \"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\": rpc error: code = NotFound desc = could not find container \"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\": container with ID starting with 113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.650777 4771 scope.go:117] "RemoveContainer" containerID="1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.651041 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\": container with ID starting with 1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727 not found: ID does not exist" 
containerID="1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.651062 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} err="failed to get container status \"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\": rpc error: code = NotFound desc = could not find container \"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\": container with ID starting with 1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.651094 4771 scope.go:117] "RemoveContainer" containerID="34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.651461 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\": container with ID starting with 34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702 not found: ID does not exist" containerID="34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.651533 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} err="failed to get container status \"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\": rpc error: code = NotFound desc = could not find container \"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\": container with ID starting with 34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.651564 4771 scope.go:117] 
"RemoveContainer" containerID="26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.652011 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\": container with ID starting with 26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6 not found: ID does not exist" containerID="26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.652080 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} err="failed to get container status \"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\": rpc error: code = NotFound desc = could not find container \"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\": container with ID starting with 26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.652122 4771 scope.go:117] "RemoveContainer" containerID="3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.652468 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\": container with ID starting with 3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c not found: ID does not exist" containerID="3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.652507 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} err="failed to get container status \"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\": rpc error: code = NotFound desc = could not find container \"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\": container with ID starting with 3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.652526 4771 scope.go:117] "RemoveContainer" containerID="90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.652785 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\": container with ID starting with 90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2 not found: ID does not exist" containerID="90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.652822 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} err="failed to get container status \"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\": rpc error: code = NotFound desc = could not find container \"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\": container with ID starting with 90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.652840 4771 scope.go:117] "RemoveContainer" containerID="6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd" Mar 19 15:29:31 crc kubenswrapper[4771]: E0319 15:29:31.653153 4771 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\": container with ID starting with 6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd not found: ID does not exist" containerID="6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.653176 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd"} err="failed to get container status \"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\": rpc error: code = NotFound desc = could not find container \"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\": container with ID starting with 6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.653192 4771 scope.go:117] "RemoveContainer" containerID="bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.653546 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} err="failed to get container status \"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d\": rpc error: code = NotFound desc = could not find container \"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d\": container with ID starting with bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.653612 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.653885 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} err="failed to get container status \"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\": rpc error: code = NotFound desc = could not find container \"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\": container with ID starting with caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.653902 4771 scope.go:117] "RemoveContainer" containerID="8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.654234 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} err="failed to get container status \"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\": rpc error: code = NotFound desc = could not find container \"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\": container with ID starting with 8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.654313 4771 scope.go:117] "RemoveContainer" containerID="113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.654602 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} err="failed to get container status \"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\": rpc error: code = NotFound desc = could not find container \"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\": container with ID starting with 
113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.654630 4771 scope.go:117] "RemoveContainer" containerID="1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.654852 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} err="failed to get container status \"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\": rpc error: code = NotFound desc = could not find container \"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\": container with ID starting with 1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.654876 4771 scope.go:117] "RemoveContainer" containerID="34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655060 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} err="failed to get container status \"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\": rpc error: code = NotFound desc = could not find container \"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\": container with ID starting with 34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655082 4771 scope.go:117] "RemoveContainer" containerID="26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655258 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} err="failed to get container status \"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\": rpc error: code = NotFound desc = could not find container \"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\": container with ID starting with 26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655279 4771 scope.go:117] "RemoveContainer" containerID="3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655435 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} err="failed to get container status \"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\": rpc error: code = NotFound desc = could not find container \"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\": container with ID starting with 3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655454 4771 scope.go:117] "RemoveContainer" containerID="90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655613 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} err="failed to get container status \"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\": rpc error: code = NotFound desc = could not find container \"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\": container with ID starting with 90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2 not found: ID does not 
exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655625 4771 scope.go:117] "RemoveContainer" containerID="6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655796 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd"} err="failed to get container status \"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\": rpc error: code = NotFound desc = could not find container \"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\": container with ID starting with 6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655809 4771 scope.go:117] "RemoveContainer" containerID="bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.655993 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} err="failed to get container status \"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d\": rpc error: code = NotFound desc = could not find container \"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d\": container with ID starting with bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.656018 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.656531 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} err="failed to get container status 
\"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\": rpc error: code = NotFound desc = could not find container \"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\": container with ID starting with caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.656552 4771 scope.go:117] "RemoveContainer" containerID="8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.656753 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} err="failed to get container status \"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\": rpc error: code = NotFound desc = could not find container \"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\": container with ID starting with 8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.656771 4771 scope.go:117] "RemoveContainer" containerID="113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.656977 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} err="failed to get container status \"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\": rpc error: code = NotFound desc = could not find container \"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\": container with ID starting with 113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.657013 4771 scope.go:117] "RemoveContainer" 
containerID="1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.657430 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} err="failed to get container status \"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\": rpc error: code = NotFound desc = could not find container \"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\": container with ID starting with 1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.657477 4771 scope.go:117] "RemoveContainer" containerID="34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.657735 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} err="failed to get container status \"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\": rpc error: code = NotFound desc = could not find container \"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\": container with ID starting with 34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.657764 4771 scope.go:117] "RemoveContainer" containerID="26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.658046 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} err="failed to get container status \"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\": rpc error: code = NotFound desc = could 
not find container \"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\": container with ID starting with 26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.658070 4771 scope.go:117] "RemoveContainer" containerID="3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.658445 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} err="failed to get container status \"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\": rpc error: code = NotFound desc = could not find container \"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\": container with ID starting with 3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.658503 4771 scope.go:117] "RemoveContainer" containerID="90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.658907 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} err="failed to get container status \"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\": rpc error: code = NotFound desc = could not find container \"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\": container with ID starting with 90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.658930 4771 scope.go:117] "RemoveContainer" containerID="6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 
15:29:31.659238 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd"} err="failed to get container status \"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\": rpc error: code = NotFound desc = could not find container \"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\": container with ID starting with 6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.659267 4771 scope.go:117] "RemoveContainer" containerID="bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.659608 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d"} err="failed to get container status \"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d\": rpc error: code = NotFound desc = could not find container \"bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d\": container with ID starting with bdb316bdf7b72c5e0ac88cf01b85c6a5f73007deaddc235fc8c89eb5e723681d not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.659687 4771 scope.go:117] "RemoveContainer" containerID="caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.660014 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b"} err="failed to get container status \"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\": rpc error: code = NotFound desc = could not find container \"caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b\": container with ID starting with 
caafb41bb0a748ef6bb0e7d82429aa15b4894cc34de9b31f9be8a274b808312b not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.660044 4771 scope.go:117] "RemoveContainer" containerID="8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.660394 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1"} err="failed to get container status \"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\": rpc error: code = NotFound desc = could not find container \"8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1\": container with ID starting with 8bb190e5751721861ae4910e66679402d0dcd0c6985156be75e9eba0ebe17bb1 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.660424 4771 scope.go:117] "RemoveContainer" containerID="113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.660756 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000"} err="failed to get container status \"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\": rpc error: code = NotFound desc = could not find container \"113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000\": container with ID starting with 113c3fe58ea53c8242ff5f682907ac8d16b355278c2b3bcba14d7b5f7f2af000 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.660794 4771 scope.go:117] "RemoveContainer" containerID="1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.661163 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727"} err="failed to get container status \"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\": rpc error: code = NotFound desc = could not find container \"1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727\": container with ID starting with 1d299f014a29edb3b061de7139cd10dbc4a79a2bfffc8c1dcaf3139a02fa1727 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.661187 4771 scope.go:117] "RemoveContainer" containerID="34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.661398 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702"} err="failed to get container status \"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\": rpc error: code = NotFound desc = could not find container \"34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702\": container with ID starting with 34eb5a3796e66058c48ebdfd0167e11e8d02c7662deaa227be4a56bc7537f702 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.661427 4771 scope.go:117] "RemoveContainer" containerID="26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.661622 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6"} err="failed to get container status \"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\": rpc error: code = NotFound desc = could not find container \"26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6\": container with ID starting with 26395a21f55388f1f7430536737012477a52b41116890247a6b6d48592a3bfe6 not found: ID does not 
exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.661646 4771 scope.go:117] "RemoveContainer" containerID="3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.661853 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c"} err="failed to get container status \"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\": rpc error: code = NotFound desc = could not find container \"3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c\": container with ID starting with 3fee607ab2477cdb418c841a6ed285af72ffba6051ee8f9499f0bb47e701fc1c not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.661879 4771 scope.go:117] "RemoveContainer" containerID="90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.662174 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2"} err="failed to get container status \"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\": rpc error: code = NotFound desc = could not find container \"90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2\": container with ID starting with 90917dd7251a9795e93a25afef82b0ab458587029262f707c18cc599c114cab2 not found: ID does not exist" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.662209 4771 scope.go:117] "RemoveContainer" containerID="6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd" Mar 19 15:29:31 crc kubenswrapper[4771]: I0319 15:29:31.662527 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd"} err="failed to get container status 
\"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\": rpc error: code = NotFound desc = could not find container \"6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd\": container with ID starting with 6b4f7cc184927fb8987b44f6523bd30b310d222bb5fc257e16cf3e9f94f567cd not found: ID does not exist" Mar 19 15:29:32 crc kubenswrapper[4771]: I0319 15:29:32.348779 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9989m_51f8c2de-454d-4b7c-bf30-2f5d12d7088e/kube-multus/2.log" Mar 19 15:29:32 crc kubenswrapper[4771]: I0319 15:29:32.349112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9989m" event={"ID":"51f8c2de-454d-4b7c-bf30-2f5d12d7088e","Type":"ContainerStarted","Data":"9202b8cdb6abde8c41eca7113d2b4cbff2895e2a99e228abe98b117af9d053f7"} Mar 19 15:29:32 crc kubenswrapper[4771]: I0319 15:29:32.352348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerStarted","Data":"a7f353c79dd7bae82be3b0bf296a30e5c4e88dc6e6ffbf9cc98b8c8bfb825196"} Mar 19 15:29:32 crc kubenswrapper[4771]: I0319 15:29:32.352379 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerStarted","Data":"cb5b24b10e3e037b846a80f2be92516cc79d017fd832fd5b1754c80fbde78ffd"} Mar 19 15:29:32 crc kubenswrapper[4771]: I0319 15:29:32.352394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerStarted","Data":"9d31bb1932b443435241a7eedbbd7f1d45bad6f2ec3198e192635ce79a829aed"} Mar 19 15:29:32 crc kubenswrapper[4771]: I0319 15:29:32.352406 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" 
event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerStarted","Data":"0f537d1649f16be2a691b20e5b045d9b2aa9e2d2e6b28532650fc2dc54fa8bc8"} Mar 19 15:29:32 crc kubenswrapper[4771]: I0319 15:29:32.352417 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerStarted","Data":"67ea7572ff4853426c34113daecf0d8d5e1a341442b4c18ca34afb68b6db4cd6"} Mar 19 15:29:33 crc kubenswrapper[4771]: I0319 15:29:33.368043 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerStarted","Data":"df4f03cbecfea6b5a9face38a2e72dfa68ade218cef4a7c484f12cf6886982b8"} Mar 19 15:29:35 crc kubenswrapper[4771]: I0319 15:29:35.386715 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerStarted","Data":"6a671ffe53a1377f484b36e397f5a3ebbf96778dd06a775563e33a24f50ea98b"} Mar 19 15:29:37 crc kubenswrapper[4771]: I0319 15:29:37.407880 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" event={"ID":"bd2e5ea2-2f31-4431-812f-91e2c066ffdf","Type":"ContainerStarted","Data":"2bc81cd5331b5e65d9f72d1d19e52d5521cd5e1b692100dd418418a0705c6592"} Mar 19 15:29:37 crc kubenswrapper[4771]: I0319 15:29:37.408551 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" Mar 19 15:29:37 crc kubenswrapper[4771]: I0319 15:29:37.408570 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" Mar 19 15:29:37 crc kubenswrapper[4771]: I0319 15:29:37.408583 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" Mar 19 15:29:37 crc 
kubenswrapper[4771]: I0319 15:29:37.456077 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" podStartSLOduration=7.456060388 podStartE2EDuration="7.456060388s" podCreationTimestamp="2026-03-19 15:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:29:37.452796506 +0000 UTC m=+836.681417708" watchObservedRunningTime="2026-03-19 15:29:37.456060388 +0000 UTC m=+836.684681590" Mar 19 15:29:37 crc kubenswrapper[4771]: I0319 15:29:37.460289 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" Mar 19 15:29:37 crc kubenswrapper[4771]: I0319 15:29:37.476471 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2" Mar 19 15:29:53 crc kubenswrapper[4771]: I0319 15:29:53.027519 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:29:53 crc kubenswrapper[4771]: I0319 15:29:53.028355 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.486611 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"] Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.489478 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb" Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.490560 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565570-jfqpb"] Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.491873 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.492160 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.494619 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565570-jfqpb" Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.497742 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.498690 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.499382 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.514390 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"] Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.518204 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565570-jfqpb"] Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.522371 4771 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 15:30:00 crc kubenswrapper[4771]: 
I0319 15:30:00.658938 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0be259c-8b16-4222-b28a-c438e82ec481-secret-volume\") pod \"collect-profiles-29565570-b2bhb\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.659357 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0be259c-8b16-4222-b28a-c438e82ec481-config-volume\") pod \"collect-profiles-29565570-b2bhb\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.659451 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhjjk\" (UniqueName: \"kubernetes.io/projected/e0be259c-8b16-4222-b28a-c438e82ec481-kube-api-access-vhjjk\") pod \"collect-profiles-29565570-b2bhb\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.659485 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lknq8\" (UniqueName: \"kubernetes.io/projected/a5faa2b3-fd6c-4380-b2a6-70749e033b35-kube-api-access-lknq8\") pod \"auto-csr-approver-29565570-jfqpb\" (UID: \"a5faa2b3-fd6c-4380-b2a6-70749e033b35\") " pod="openshift-infra/auto-csr-approver-29565570-jfqpb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.760681 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0be259c-8b16-4222-b28a-c438e82ec481-config-volume\") pod \"collect-profiles-29565570-b2bhb\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.760794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhjjk\" (UniqueName: \"kubernetes.io/projected/e0be259c-8b16-4222-b28a-c438e82ec481-kube-api-access-vhjjk\") pod \"collect-profiles-29565570-b2bhb\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.760847 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lknq8\" (UniqueName: \"kubernetes.io/projected/a5faa2b3-fd6c-4380-b2a6-70749e033b35-kube-api-access-lknq8\") pod \"auto-csr-approver-29565570-jfqpb\" (UID: \"a5faa2b3-fd6c-4380-b2a6-70749e033b35\") " pod="openshift-infra/auto-csr-approver-29565570-jfqpb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.760886 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0be259c-8b16-4222-b28a-c438e82ec481-secret-volume\") pod \"collect-profiles-29565570-b2bhb\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.762115 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0be259c-8b16-4222-b28a-c438e82ec481-config-volume\") pod \"collect-profiles-29565570-b2bhb\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.770214 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0be259c-8b16-4222-b28a-c438e82ec481-secret-volume\") pod \"collect-profiles-29565570-b2bhb\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.779043 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhjjk\" (UniqueName: \"kubernetes.io/projected/e0be259c-8b16-4222-b28a-c438e82ec481-kube-api-access-vhjjk\") pod \"collect-profiles-29565570-b2bhb\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.779397 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lknq8\" (UniqueName: \"kubernetes.io/projected/a5faa2b3-fd6c-4380-b2a6-70749e033b35-kube-api-access-lknq8\") pod \"auto-csr-approver-29565570-jfqpb\" (UID: \"a5faa2b3-fd6c-4380-b2a6-70749e033b35\") " pod="openshift-infra/auto-csr-approver-29565570-jfqpb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.830810 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:00 crc kubenswrapper[4771]: I0319 15:30:00.842779 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565570-jfqpb"
Mar 19 15:30:01 crc kubenswrapper[4771]: I0319 15:30:01.055531 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"]
Mar 19 15:30:01 crc kubenswrapper[4771]: I0319 15:30:01.089325 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bw7d2"
Mar 19 15:30:01 crc kubenswrapper[4771]: I0319 15:30:01.302257 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565570-jfqpb"]
Mar 19 15:30:01 crc kubenswrapper[4771]: I0319 15:30:01.496254 4771 generic.go:334] "Generic (PLEG): container finished" podID="e0be259c-8b16-4222-b28a-c438e82ec481" containerID="f9c2b367acb66060a4921ddbb0cc2901d1095b745459741b05d19b660495980e" exitCode=0
Mar 19 15:30:01 crc kubenswrapper[4771]: I0319 15:30:01.496310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb" event={"ID":"e0be259c-8b16-4222-b28a-c438e82ec481","Type":"ContainerDied","Data":"f9c2b367acb66060a4921ddbb0cc2901d1095b745459741b05d19b660495980e"}
Mar 19 15:30:01 crc kubenswrapper[4771]: I0319 15:30:01.496367 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb" event={"ID":"e0be259c-8b16-4222-b28a-c438e82ec481","Type":"ContainerStarted","Data":"efb71966b6b2ea136df114fb0893926735b6c4ec19fac73b895cdfdbe8f64642"}
Mar 19 15:30:01 crc kubenswrapper[4771]: I0319 15:30:01.497899 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565570-jfqpb" event={"ID":"a5faa2b3-fd6c-4380-b2a6-70749e033b35","Type":"ContainerStarted","Data":"a5653e4428208817363fe2134ae907175aafe3702f5ca15121c410e14f320c37"}
Mar 19 15:30:02 crc kubenswrapper[4771]: I0319 15:30:02.816571 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:02 crc kubenswrapper[4771]: I0319 15:30:02.988853 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0be259c-8b16-4222-b28a-c438e82ec481-secret-volume\") pod \"e0be259c-8b16-4222-b28a-c438e82ec481\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") "
Mar 19 15:30:02 crc kubenswrapper[4771]: I0319 15:30:02.989040 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0be259c-8b16-4222-b28a-c438e82ec481-config-volume\") pod \"e0be259c-8b16-4222-b28a-c438e82ec481\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") "
Mar 19 15:30:02 crc kubenswrapper[4771]: I0319 15:30:02.989123 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhjjk\" (UniqueName: \"kubernetes.io/projected/e0be259c-8b16-4222-b28a-c438e82ec481-kube-api-access-vhjjk\") pod \"e0be259c-8b16-4222-b28a-c438e82ec481\" (UID: \"e0be259c-8b16-4222-b28a-c438e82ec481\") "
Mar 19 15:30:02 crc kubenswrapper[4771]: I0319 15:30:02.989815 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0be259c-8b16-4222-b28a-c438e82ec481-config-volume" (OuterVolumeSpecName: "config-volume") pod "e0be259c-8b16-4222-b28a-c438e82ec481" (UID: "e0be259c-8b16-4222-b28a-c438e82ec481"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:30:02 crc kubenswrapper[4771]: I0319 15:30:02.994695 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0be259c-8b16-4222-b28a-c438e82ec481-kube-api-access-vhjjk" (OuterVolumeSpecName: "kube-api-access-vhjjk") pod "e0be259c-8b16-4222-b28a-c438e82ec481" (UID: "e0be259c-8b16-4222-b28a-c438e82ec481"). InnerVolumeSpecName "kube-api-access-vhjjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:30:02 crc kubenswrapper[4771]: I0319 15:30:02.995346 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0be259c-8b16-4222-b28a-c438e82ec481-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e0be259c-8b16-4222-b28a-c438e82ec481" (UID: "e0be259c-8b16-4222-b28a-c438e82ec481"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.091473 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0be259c-8b16-4222-b28a-c438e82ec481-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.091523 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0be259c-8b16-4222-b28a-c438e82ec481-config-volume\") on node \"crc\" DevicePath \"\""
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.091545 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhjjk\" (UniqueName: \"kubernetes.io/projected/e0be259c-8b16-4222-b28a-c438e82ec481-kube-api-access-vhjjk\") on node \"crc\" DevicePath \"\""
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.343978 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pgjm5"]
Mar 19 15:30:03 crc kubenswrapper[4771]: E0319 15:30:03.344302 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0be259c-8b16-4222-b28a-c438e82ec481" containerName="collect-profiles"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.344322 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0be259c-8b16-4222-b28a-c438e82ec481" containerName="collect-profiles"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.344487 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0be259c-8b16-4222-b28a-c438e82ec481" containerName="collect-profiles"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.345825 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.358330 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgjm5"]
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.496483 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-utilities\") pod \"community-operators-pgjm5\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.496542 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-catalog-content\") pod \"community-operators-pgjm5\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.496567 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvk6j\" (UniqueName: \"kubernetes.io/projected/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-kube-api-access-kvk6j\") pod \"community-operators-pgjm5\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.511928 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.517688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565570-b2bhb" event={"ID":"e0be259c-8b16-4222-b28a-c438e82ec481","Type":"ContainerDied","Data":"efb71966b6b2ea136df114fb0893926735b6c4ec19fac73b895cdfdbe8f64642"}
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.517752 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efb71966b6b2ea136df114fb0893926735b6c4ec19fac73b895cdfdbe8f64642"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.598209 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-utilities\") pod \"community-operators-pgjm5\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.598253 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-catalog-content\") pod \"community-operators-pgjm5\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.598508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvk6j\" (UniqueName: \"kubernetes.io/projected/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-kube-api-access-kvk6j\") pod \"community-operators-pgjm5\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.598716 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-catalog-content\") pod \"community-operators-pgjm5\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.598777 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-utilities\") pod \"community-operators-pgjm5\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.621759 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvk6j\" (UniqueName: \"kubernetes.io/projected/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-kube-api-access-kvk6j\") pod \"community-operators-pgjm5\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:03 crc kubenswrapper[4771]: I0319 15:30:03.731096 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:04 crc kubenswrapper[4771]: I0319 15:30:04.024581 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgjm5"]
Mar 19 15:30:04 crc kubenswrapper[4771]: W0319 15:30:04.031621 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0c011d_e4d5_44ea_9abb_7b5dbd903634.slice/crio-ec6a7c9aae555956adba47dcceffe2930b4e18aef96aab45e6e2807bd4112552 WatchSource:0}: Error finding container ec6a7c9aae555956adba47dcceffe2930b4e18aef96aab45e6e2807bd4112552: Status 404 returned error can't find the container with id ec6a7c9aae555956adba47dcceffe2930b4e18aef96aab45e6e2807bd4112552
Mar 19 15:30:04 crc kubenswrapper[4771]: I0319 15:30:04.522094 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerID="021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880" exitCode=0
Mar 19 15:30:04 crc kubenswrapper[4771]: I0319 15:30:04.522213 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgjm5" event={"ID":"bf0c011d-e4d5-44ea-9abb-7b5dbd903634","Type":"ContainerDied","Data":"021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880"}
Mar 19 15:30:04 crc kubenswrapper[4771]: I0319 15:30:04.522655 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgjm5" event={"ID":"bf0c011d-e4d5-44ea-9abb-7b5dbd903634","Type":"ContainerStarted","Data":"ec6a7c9aae555956adba47dcceffe2930b4e18aef96aab45e6e2807bd4112552"}
Mar 19 15:30:06 crc kubenswrapper[4771]: I0319 15:30:06.539576 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerID="66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8" exitCode=0
Mar 19 15:30:06 crc kubenswrapper[4771]: I0319 15:30:06.539686 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgjm5" event={"ID":"bf0c011d-e4d5-44ea-9abb-7b5dbd903634","Type":"ContainerDied","Data":"66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8"}
Mar 19 15:30:08 crc kubenswrapper[4771]: I0319 15:30:08.563969 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgjm5" event={"ID":"bf0c011d-e4d5-44ea-9abb-7b5dbd903634","Type":"ContainerStarted","Data":"4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a"}
Mar 19 15:30:08 crc kubenswrapper[4771]: I0319 15:30:08.565632 4771 generic.go:334] "Generic (PLEG): container finished" podID="a5faa2b3-fd6c-4380-b2a6-70749e033b35" containerID="de583627c27a36e50e86e60a2d6e637578d599acc73b71c3837fc8ca21e2d5f0" exitCode=0
Mar 19 15:30:08 crc kubenswrapper[4771]: I0319 15:30:08.565675 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565570-jfqpb" event={"ID":"a5faa2b3-fd6c-4380-b2a6-70749e033b35","Type":"ContainerDied","Data":"de583627c27a36e50e86e60a2d6e637578d599acc73b71c3837fc8ca21e2d5f0"}
Mar 19 15:30:08 crc kubenswrapper[4771]: I0319 15:30:08.583042 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pgjm5" podStartSLOduration=2.66470558 podStartE2EDuration="5.583007123s" podCreationTimestamp="2026-03-19 15:30:03 +0000 UTC" firstStartedPulling="2026-03-19 15:30:04.525614891 +0000 UTC m=+863.754236123" lastFinishedPulling="2026-03-19 15:30:07.443916454 +0000 UTC m=+866.672537666" observedRunningTime="2026-03-19 15:30:08.580474499 +0000 UTC m=+867.809095721" watchObservedRunningTime="2026-03-19 15:30:08.583007123 +0000 UTC m=+867.811628335"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.551900 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"]
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.553474 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.556810 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.565172 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"]
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.677065 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.677251 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.677353 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmvbq\" (UniqueName: \"kubernetes.io/projected/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-kube-api-access-lmvbq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.779507 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.779771 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.779853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmvbq\" (UniqueName: \"kubernetes.io/projected/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-kube-api-access-lmvbq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.779979 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.780316 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.801098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmvbq\" (UniqueName: \"kubernetes.io/projected/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-kube-api-access-lmvbq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.854207 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565570-jfqpb"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.879645 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.982138 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lknq8\" (UniqueName: \"kubernetes.io/projected/a5faa2b3-fd6c-4380-b2a6-70749e033b35-kube-api-access-lknq8\") pod \"a5faa2b3-fd6c-4380-b2a6-70749e033b35\" (UID: \"a5faa2b3-fd6c-4380-b2a6-70749e033b35\") "
Mar 19 15:30:09 crc kubenswrapper[4771]: I0319 15:30:09.986343 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5faa2b3-fd6c-4380-b2a6-70749e033b35-kube-api-access-lknq8" (OuterVolumeSpecName: "kube-api-access-lknq8") pod "a5faa2b3-fd6c-4380-b2a6-70749e033b35" (UID: "a5faa2b3-fd6c-4380-b2a6-70749e033b35"). InnerVolumeSpecName "kube-api-access-lknq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:30:10 crc kubenswrapper[4771]: I0319 15:30:10.093766 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lknq8\" (UniqueName: \"kubernetes.io/projected/a5faa2b3-fd6c-4380-b2a6-70749e033b35-kube-api-access-lknq8\") on node \"crc\" DevicePath \"\""
Mar 19 15:30:10 crc kubenswrapper[4771]: I0319 15:30:10.112301 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"]
Mar 19 15:30:10 crc kubenswrapper[4771]: I0319 15:30:10.582348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565570-jfqpb" event={"ID":"a5faa2b3-fd6c-4380-b2a6-70749e033b35","Type":"ContainerDied","Data":"a5653e4428208817363fe2134ae907175aafe3702f5ca15121c410e14f320c37"}
Mar 19 15:30:10 crc kubenswrapper[4771]: I0319 15:30:10.582387 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5653e4428208817363fe2134ae907175aafe3702f5ca15121c410e14f320c37"
Mar 19 15:30:10 crc kubenswrapper[4771]: I0319 15:30:10.582411 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565570-jfqpb"
Mar 19 15:30:10 crc kubenswrapper[4771]: I0319 15:30:10.583880 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2" event={"ID":"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d","Type":"ContainerStarted","Data":"fe4348bb584cb4f497d48fc87172857335c347de3b4e754e313bfca4c25897db"}
Mar 19 15:30:10 crc kubenswrapper[4771]: I0319 15:30:10.583931 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2" event={"ID":"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d","Type":"ContainerStarted","Data":"02259e3171911ab0e94a669c1f033265be86e4994fe629413225a4192e89a806"}
Mar 19 15:30:10 crc kubenswrapper[4771]: I0319 15:30:10.929301 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565564-vdfl8"]
Mar 19 15:30:10 crc kubenswrapper[4771]: I0319 15:30:10.936538 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565564-vdfl8"]
Mar 19 15:30:11 crc kubenswrapper[4771]: I0319 15:30:11.517001 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e575b8dd-4c4b-446c-83c7-8fde3ba656ec" path="/var/lib/kubelet/pods/e575b8dd-4c4b-446c-83c7-8fde3ba656ec/volumes"
Mar 19 15:30:11 crc kubenswrapper[4771]: I0319 15:30:11.593316 4771 generic.go:334] "Generic (PLEG): container finished" podID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" containerID="fe4348bb584cb4f497d48fc87172857335c347de3b4e754e313bfca4c25897db" exitCode=0
Mar 19 15:30:11 crc kubenswrapper[4771]: I0319 15:30:11.593371 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2" event={"ID":"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d","Type":"ContainerDied","Data":"fe4348bb584cb4f497d48fc87172857335c347de3b4e754e313bfca4c25897db"}
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.505923 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-crmtc"]
Mar 19 15:30:12 crc kubenswrapper[4771]: E0319 15:30:12.506413 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5faa2b3-fd6c-4380-b2a6-70749e033b35" containerName="oc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.506459 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5faa2b3-fd6c-4380-b2a6-70749e033b35" containerName="oc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.506683 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5faa2b3-fd6c-4380-b2a6-70749e033b35" containerName="oc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.508461 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.523173 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crmtc"]
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.530883 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-catalog-content\") pod \"redhat-operators-crmtc\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.530940 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-utilities\") pod \"redhat-operators-crmtc\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.531024 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76fg4\" (UniqueName: \"kubernetes.io/projected/37214cc0-ce69-44e2-80a0-ea1442abb9fa-kube-api-access-76fg4\") pod \"redhat-operators-crmtc\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.633026 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76fg4\" (UniqueName: \"kubernetes.io/projected/37214cc0-ce69-44e2-80a0-ea1442abb9fa-kube-api-access-76fg4\") pod \"redhat-operators-crmtc\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.633170 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-catalog-content\") pod \"redhat-operators-crmtc\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.633415 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-utilities\") pod \"redhat-operators-crmtc\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.634074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-utilities\") pod \"redhat-operators-crmtc\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.634375 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-catalog-content\") pod \"redhat-operators-crmtc\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.664152 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76fg4\" (UniqueName: \"kubernetes.io/projected/37214cc0-ce69-44e2-80a0-ea1442abb9fa-kube-api-access-76fg4\") pod \"redhat-operators-crmtc\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:12 crc kubenswrapper[4771]: I0319 15:30:12.844843 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crmtc"
Mar 19 15:30:13 crc kubenswrapper[4771]: I0319 15:30:13.042899 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crmtc"]
Mar 19 15:30:13 crc kubenswrapper[4771]: I0319 15:30:13.608274 4771 generic.go:334] "Generic (PLEG): container finished" podID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerID="81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641" exitCode=0
Mar 19 15:30:13 crc kubenswrapper[4771]: I0319 15:30:13.608322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crmtc" event={"ID":"37214cc0-ce69-44e2-80a0-ea1442abb9fa","Type":"ContainerDied","Data":"81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641"}
Mar 19 15:30:13 crc kubenswrapper[4771]: I0319 15:30:13.608347 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crmtc" event={"ID":"37214cc0-ce69-44e2-80a0-ea1442abb9fa","Type":"ContainerStarted","Data":"a2e8d5631ae933c188201b381d535cacf06f8079c676dee7a6d8d0b80ea10463"}
Mar 19 15:30:13 crc kubenswrapper[4771]: I0319 15:30:13.731577 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:13 crc kubenswrapper[4771]: I0319 15:30:13.731635 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:13 crc kubenswrapper[4771]: I0319 15:30:13.792451 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:14 crc kubenswrapper[4771]: I0319 15:30:14.618627 4771 generic.go:334] "Generic (PLEG): container finished" podID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" containerID="3c3a32b3d9bc24bb48e18d2810bea557d476e7982c7b8e37b1cca9f04643961f" exitCode=0
Mar 19 15:30:14 crc kubenswrapper[4771]: I0319 15:30:14.618746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2" event={"ID":"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d","Type":"ContainerDied","Data":"3c3a32b3d9bc24bb48e18d2810bea557d476e7982c7b8e37b1cca9f04643961f"}
Mar 19 15:30:14 crc kubenswrapper[4771]: I0319 15:30:14.698023 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pgjm5"
Mar 19 15:30:15 crc kubenswrapper[4771]: I0319 15:30:15.627514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crmtc" event={"ID":"37214cc0-ce69-44e2-80a0-ea1442abb9fa","Type":"ContainerStarted","Data":"0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20"}
Mar 19 15:30:15 crc kubenswrapper[4771]: I0319 15:30:15.632679 4771 generic.go:334] "Generic (PLEG): container finished" podID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" containerID="a769ed2e2957817fb6b832757bf36d1eecb426e29bc25ce4c414994bac6a21f3" exitCode=0
Mar 19 15:30:15 crc kubenswrapper[4771]: I0319 15:30:15.633345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2" event={"ID":"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d","Type":"ContainerDied","Data":"a769ed2e2957817fb6b832757bf36d1eecb426e29bc25ce4c414994bac6a21f3"}
Mar 19 15:30:16 crc kubenswrapper[4771]: I0319 15:30:16.642501 4771 generic.go:334] "Generic (PLEG): container finished" podID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerID="0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20" exitCode=0
Mar 19 15:30:16 crc kubenswrapper[4771]: I0319 15:30:16.642607 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crmtc" event={"ID":"37214cc0-ce69-44e2-80a0-ea1442abb9fa","Type":"ContainerDied","Data":"0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20"}
Mar 19 15:30:16 crc kubenswrapper[4771]: I0319 15:30:16.947812 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2"
Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.097918 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-util\") pod \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") "
Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.098068 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmvbq\" (UniqueName: \"kubernetes.io/projected/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-kube-api-access-lmvbq\") pod \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") "
Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.098268 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-bundle\") pod \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\" (UID: \"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d\") "
Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.099384 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-bundle" (OuterVolumeSpecName: "bundle") pod "1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" (UID: "1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.104616 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-kube-api-access-lmvbq" (OuterVolumeSpecName: "kube-api-access-lmvbq") pod "1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" (UID: "1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d"). InnerVolumeSpecName "kube-api-access-lmvbq".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.108031 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-util" (OuterVolumeSpecName: "util") pod "1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" (UID: "1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.199838 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.199876 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-util\") on node \"crc\" DevicePath \"\"" Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.199886 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmvbq\" (UniqueName: \"kubernetes.io/projected/1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d-kube-api-access-lmvbq\") on node \"crc\" DevicePath \"\"" Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.298817 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgjm5"] Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.299262 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pgjm5" podUID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerName="registry-server" containerID="cri-o://4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a" gracePeriod=2 Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.658705 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2" event={"ID":"1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d","Type":"ContainerDied","Data":"02259e3171911ab0e94a669c1f033265be86e4994fe629413225a4192e89a806"} Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.658917 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02259e3171911ab0e94a669c1f033265be86e4994fe629413225a4192e89a806" Mar 19 15:30:17 crc kubenswrapper[4771]: I0319 15:30:17.658871 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.075409 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-78vtb"] Mar 19 15:30:19 crc kubenswrapper[4771]: E0319 15:30:19.076102 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" containerName="util" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.076123 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" containerName="util" Mar 19 15:30:19 crc kubenswrapper[4771]: E0319 15:30:19.076151 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" containerName="extract" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.076162 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" containerName="extract" Mar 19 15:30:19 crc kubenswrapper[4771]: E0319 15:30:19.076177 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" containerName="pull" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.076189 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" 
containerName="pull" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.076333 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d" containerName="extract" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.076901 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-78vtb" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.078823 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-qrr65" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.078941 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.080447 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.088170 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-78vtb"] Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.125118 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlngh\" (UniqueName: \"kubernetes.io/projected/378cb353-a241-4bd4-910e-593931ac24cc-kube-api-access-dlngh\") pod \"nmstate-operator-796d4cfff4-78vtb\" (UID: \"378cb353-a241-4bd4-910e-593931ac24cc\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-78vtb" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.226691 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlngh\" (UniqueName: \"kubernetes.io/projected/378cb353-a241-4bd4-910e-593931ac24cc-kube-api-access-dlngh\") pod \"nmstate-operator-796d4cfff4-78vtb\" (UID: \"378cb353-a241-4bd4-910e-593931ac24cc\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-78vtb" Mar 19 
15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.242683 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlngh\" (UniqueName: \"kubernetes.io/projected/378cb353-a241-4bd4-910e-593931ac24cc-kube-api-access-dlngh\") pod \"nmstate-operator-796d4cfff4-78vtb\" (UID: \"378cb353-a241-4bd4-910e-593931ac24cc\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-78vtb" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.395833 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-78vtb" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.522851 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgjm5" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.594610 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-78vtb"] Mar 19 15:30:19 crc kubenswrapper[4771]: W0319 15:30:19.598346 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod378cb353_a241_4bd4_910e_593931ac24cc.slice/crio-67d0626b04964f8c86d0afa23dd1dcd8adb1b8974b5d371ca4b4e1838eb3fa5b WatchSource:0}: Error finding container 67d0626b04964f8c86d0afa23dd1dcd8adb1b8974b5d371ca4b4e1838eb3fa5b: Status 404 returned error can't find the container with id 67d0626b04964f8c86d0afa23dd1dcd8adb1b8974b5d371ca4b4e1838eb3fa5b Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.636302 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-utilities\") pod \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.636402 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-catalog-content\") pod \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.636419 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvk6j\" (UniqueName: \"kubernetes.io/projected/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-kube-api-access-kvk6j\") pod \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\" (UID: \"bf0c011d-e4d5-44ea-9abb-7b5dbd903634\") " Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.639058 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-utilities" (OuterVolumeSpecName: "utilities") pod "bf0c011d-e4d5-44ea-9abb-7b5dbd903634" (UID: "bf0c011d-e4d5-44ea-9abb-7b5dbd903634"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.640585 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-kube-api-access-kvk6j" (OuterVolumeSpecName: "kube-api-access-kvk6j") pod "bf0c011d-e4d5-44ea-9abb-7b5dbd903634" (UID: "bf0c011d-e4d5-44ea-9abb-7b5dbd903634"). InnerVolumeSpecName "kube-api-access-kvk6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.674820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgjm5" event={"ID":"bf0c011d-e4d5-44ea-9abb-7b5dbd903634","Type":"ContainerDied","Data":"4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a"} Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.674890 4771 scope.go:117] "RemoveContainer" containerID="4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.674912 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgjm5" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.674842 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerID="4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a" exitCode=0 Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.675049 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgjm5" event={"ID":"bf0c011d-e4d5-44ea-9abb-7b5dbd903634","Type":"ContainerDied","Data":"ec6a7c9aae555956adba47dcceffe2930b4e18aef96aab45e6e2807bd4112552"} Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.677402 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-78vtb" event={"ID":"378cb353-a241-4bd4-910e-593931ac24cc","Type":"ContainerStarted","Data":"67d0626b04964f8c86d0afa23dd1dcd8adb1b8974b5d371ca4b4e1838eb3fa5b"} Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.696515 4771 scope.go:117] "RemoveContainer" containerID="66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.715243 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf0c011d-e4d5-44ea-9abb-7b5dbd903634" (UID: "bf0c011d-e4d5-44ea-9abb-7b5dbd903634"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.730144 4771 scope.go:117] "RemoveContainer" containerID="021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.738699 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.738739 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.738755 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvk6j\" (UniqueName: \"kubernetes.io/projected/bf0c011d-e4d5-44ea-9abb-7b5dbd903634-kube-api-access-kvk6j\") on node \"crc\" DevicePath \"\"" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.754711 4771 scope.go:117] "RemoveContainer" containerID="4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a" Mar 19 15:30:19 crc kubenswrapper[4771]: E0319 15:30:19.755359 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a\": container with ID starting with 4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a not found: ID does not exist" containerID="4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 
15:30:19.755391 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a"} err="failed to get container status \"4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a\": rpc error: code = NotFound desc = could not find container \"4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a\": container with ID starting with 4c05fa229b646a105fdce8e61434ecbfd6b0a04dc2484ae335e787ea97825b9a not found: ID does not exist" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.755410 4771 scope.go:117] "RemoveContainer" containerID="66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8" Mar 19 15:30:19 crc kubenswrapper[4771]: E0319 15:30:19.755645 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8\": container with ID starting with 66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8 not found: ID does not exist" containerID="66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.755667 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8"} err="failed to get container status \"66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8\": rpc error: code = NotFound desc = could not find container \"66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8\": container with ID starting with 66e93cb93fe4f1c6ec61b5d27f3adbdf8f4ec0e6397eb4ed4f7ef7cfc38f0fc8 not found: ID does not exist" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.755678 4771 scope.go:117] "RemoveContainer" containerID="021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880" Mar 19 15:30:19 crc 
kubenswrapper[4771]: E0319 15:30:19.756068 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880\": container with ID starting with 021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880 not found: ID does not exist" containerID="021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880" Mar 19 15:30:19 crc kubenswrapper[4771]: I0319 15:30:19.756097 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880"} err="failed to get container status \"021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880\": rpc error: code = NotFound desc = could not find container \"021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880\": container with ID starting with 021bfb2a39276ce2f416d0b39572008eac04cf48cd9a12dd6329845e83917880 not found: ID does not exist" Mar 19 15:30:20 crc kubenswrapper[4771]: I0319 15:30:20.000625 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgjm5"] Mar 19 15:30:20 crc kubenswrapper[4771]: I0319 15:30:20.007717 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pgjm5"] Mar 19 15:30:21 crc kubenswrapper[4771]: I0319 15:30:21.520734 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" path="/var/lib/kubelet/pods/bf0c011d-e4d5-44ea-9abb-7b5dbd903634/volumes" Mar 19 15:30:22 crc kubenswrapper[4771]: I0319 15:30:22.713541 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crmtc" event={"ID":"37214cc0-ce69-44e2-80a0-ea1442abb9fa","Type":"ContainerStarted","Data":"c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f"} Mar 19 15:30:22 crc kubenswrapper[4771]: 
I0319 15:30:22.741372 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-crmtc" podStartSLOduration=2.8366705100000003 podStartE2EDuration="10.741354107s" podCreationTimestamp="2026-03-19 15:30:12 +0000 UTC" firstStartedPulling="2026-03-19 15:30:13.609831732 +0000 UTC m=+872.838452934" lastFinishedPulling="2026-03-19 15:30:21.514515319 +0000 UTC m=+880.743136531" observedRunningTime="2026-03-19 15:30:22.739312986 +0000 UTC m=+881.967934218" watchObservedRunningTime="2026-03-19 15:30:22.741354107 +0000 UTC m=+881.969975319" Mar 19 15:30:22 crc kubenswrapper[4771]: I0319 15:30:22.845023 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-crmtc" Mar 19 15:30:22 crc kubenswrapper[4771]: I0319 15:30:22.845070 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-crmtc" Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.027279 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.027348 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.027399 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.028065 4771 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b1c74e779cae295d8261f06e3e0b3804798c6e48c30f016b0eaeacf144fab8c8"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.028124 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://b1c74e779cae295d8261f06e3e0b3804798c6e48c30f016b0eaeacf144fab8c8" gracePeriod=600 Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.722439 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-78vtb" event={"ID":"378cb353-a241-4bd4-910e-593931ac24cc","Type":"ContainerStarted","Data":"f7dc7bbdf9dcb34bf2e4d115feee315b696bb52c4f3d7f94e710fc701fc55f61"} Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.727381 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="b1c74e779cae295d8261f06e3e0b3804798c6e48c30f016b0eaeacf144fab8c8" exitCode=0 Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.727484 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"b1c74e779cae295d8261f06e3e0b3804798c6e48c30f016b0eaeacf144fab8c8"} Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.727546 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" 
event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"ed5a553c2d92c915ce47410116c0fc185162ea3ab77f14a7e2453e14985c8a40"} Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.727574 4771 scope.go:117] "RemoveContainer" containerID="41fe4b028ed4c1241b67194aaa2a009f141466dd206828b233686513e2dbdf58" Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.756874 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-78vtb" podStartSLOduration=1.058378564 podStartE2EDuration="4.75685978s" podCreationTimestamp="2026-03-19 15:30:19 +0000 UTC" firstStartedPulling="2026-03-19 15:30:19.600279602 +0000 UTC m=+878.828900794" lastFinishedPulling="2026-03-19 15:30:23.298760798 +0000 UTC m=+882.527382010" observedRunningTime="2026-03-19 15:30:23.75328984 +0000 UTC m=+882.981911042" watchObservedRunningTime="2026-03-19 15:30:23.75685978 +0000 UTC m=+882.985480982" Mar 19 15:30:23 crc kubenswrapper[4771]: I0319 15:30:23.902851 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-crmtc" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerName="registry-server" probeResult="failure" output=< Mar 19 15:30:23 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Mar 19 15:30:23 crc kubenswrapper[4771]: > Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.065291 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z466c"] Mar 19 15:30:29 crc kubenswrapper[4771]: E0319 15:30:29.066009 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerName="extract-content" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.066024 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerName="extract-content" Mar 19 15:30:29 crc kubenswrapper[4771]: E0319 
15:30:29.066040 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerName="registry-server" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.066049 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerName="registry-server" Mar 19 15:30:29 crc kubenswrapper[4771]: E0319 15:30:29.066072 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerName="extract-utilities" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.066081 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerName="extract-utilities" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.066223 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf0c011d-e4d5-44ea-9abb-7b5dbd903634" containerName="registry-server" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.067181 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z466c" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.068755 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tmtn\" (UniqueName: \"kubernetes.io/projected/b72761bc-3ae8-4464-9544-d0ed1781f1e5-kube-api-access-7tmtn\") pod \"nmstate-metrics-9b8c8685d-z466c\" (UID: \"b72761bc-3ae8-4464-9544-d0ed1781f1e5\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z466c" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.075805 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb"] Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.076661 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.080298 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-fcjkm" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.084588 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.087767 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z466c"] Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.095229 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb"] Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.106304 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-ctqt6"] Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.107091 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.169702 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c6fd6496-7123-4ff3-adea-c38716b6a50a-ovs-socket\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.169756 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stq72\" (UniqueName: \"kubernetes.io/projected/f9f6d719-a63b-4b45-a13c-64480e0dcc69-kube-api-access-stq72\") pod \"nmstate-webhook-5f558f5558-m2mxb\" (UID: \"f9f6d719-a63b-4b45-a13c-64480e0dcc69\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.169782 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9f6d719-a63b-4b45-a13c-64480e0dcc69-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m2mxb\" (UID: \"f9f6d719-a63b-4b45-a13c-64480e0dcc69\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.169806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c6fd6496-7123-4ff3-adea-c38716b6a50a-dbus-socket\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.169828 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tmtn\" (UniqueName: \"kubernetes.io/projected/b72761bc-3ae8-4464-9544-d0ed1781f1e5-kube-api-access-7tmtn\") pod 
\"nmstate-metrics-9b8c8685d-z466c\" (UID: \"b72761bc-3ae8-4464-9544-d0ed1781f1e5\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z466c" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.169847 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df7hb\" (UniqueName: \"kubernetes.io/projected/c6fd6496-7123-4ff3-adea-c38716b6a50a-kube-api-access-df7hb\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.170105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c6fd6496-7123-4ff3-adea-c38716b6a50a-nmstate-lock\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.191424 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tmtn\" (UniqueName: \"kubernetes.io/projected/b72761bc-3ae8-4464-9544-d0ed1781f1e5-kube-api-access-7tmtn\") pod \"nmstate-metrics-9b8c8685d-z466c\" (UID: \"b72761bc-3ae8-4464-9544-d0ed1781f1e5\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z466c" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.200678 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq"] Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.201652 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.204386 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.204630 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-kpbtr" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.208022 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.210169 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq"] Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.270890 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc9vc\" (UniqueName: \"kubernetes.io/projected/0ff074f0-de88-4680-a784-82e407cb6a11-kube-api-access-vc9vc\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.270954 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c6fd6496-7123-4ff3-adea-c38716b6a50a-dbus-socket\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271011 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df7hb\" (UniqueName: \"kubernetes.io/projected/c6fd6496-7123-4ff3-adea-c38716b6a50a-kube-api-access-df7hb\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " 
pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271049 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0ff074f0-de88-4680-a784-82e407cb6a11-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271086 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c6fd6496-7123-4ff3-adea-c38716b6a50a-nmstate-lock\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271133 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c6fd6496-7123-4ff3-adea-c38716b6a50a-ovs-socket\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c6fd6496-7123-4ff3-adea-c38716b6a50a-nmstate-lock\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271211 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff074f0-de88-4680-a784-82e407cb6a11-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271274 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c6fd6496-7123-4ff3-adea-c38716b6a50a-ovs-socket\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271355 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stq72\" (UniqueName: \"kubernetes.io/projected/f9f6d719-a63b-4b45-a13c-64480e0dcc69-kube-api-access-stq72\") pod \"nmstate-webhook-5f558f5558-m2mxb\" (UID: \"f9f6d719-a63b-4b45-a13c-64480e0dcc69\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c6fd6496-7123-4ff3-adea-c38716b6a50a-dbus-socket\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.271410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9f6d719-a63b-4b45-a13c-64480e0dcc69-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m2mxb\" (UID: \"f9f6d719-a63b-4b45-a13c-64480e0dcc69\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:29 crc kubenswrapper[4771]: E0319 15:30:29.271539 4771 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 19 15:30:29 crc kubenswrapper[4771]: E0319 15:30:29.271589 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f9f6d719-a63b-4b45-a13c-64480e0dcc69-tls-key-pair podName:f9f6d719-a63b-4b45-a13c-64480e0dcc69 nodeName:}" failed. No retries permitted until 2026-03-19 15:30:29.771571616 +0000 UTC m=+889.000192818 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/f9f6d719-a63b-4b45-a13c-64480e0dcc69-tls-key-pair") pod "nmstate-webhook-5f558f5558-m2mxb" (UID: "f9f6d719-a63b-4b45-a13c-64480e0dcc69") : secret "openshift-nmstate-webhook" not found Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.288393 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stq72\" (UniqueName: \"kubernetes.io/projected/f9f6d719-a63b-4b45-a13c-64480e0dcc69-kube-api-access-stq72\") pod \"nmstate-webhook-5f558f5558-m2mxb\" (UID: \"f9f6d719-a63b-4b45-a13c-64480e0dcc69\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.293625 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df7hb\" (UniqueName: \"kubernetes.io/projected/c6fd6496-7123-4ff3-adea-c38716b6a50a-kube-api-access-df7hb\") pod \"nmstate-handler-ctqt6\" (UID: \"c6fd6496-7123-4ff3-adea-c38716b6a50a\") " pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.371969 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff074f0-de88-4680-a784-82e407cb6a11-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: E0319 15:30:29.372131 4771 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 
15:30:29.372344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc9vc\" (UniqueName: \"kubernetes.io/projected/0ff074f0-de88-4680-a784-82e407cb6a11-kube-api-access-vc9vc\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.372391 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0ff074f0-de88-4680-a784-82e407cb6a11-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: E0319 15:30:29.372413 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ff074f0-de88-4680-a784-82e407cb6a11-plugin-serving-cert podName:0ff074f0-de88-4680-a784-82e407cb6a11 nodeName:}" failed. No retries permitted until 2026-03-19 15:30:29.872389363 +0000 UTC m=+889.101010775 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/0ff074f0-de88-4680-a784-82e407cb6a11-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-2rxdq" (UID: "0ff074f0-de88-4680-a784-82e407cb6a11") : secret "plugin-serving-cert" not found Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.373509 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0ff074f0-de88-4680-a784-82e407cb6a11-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.384120 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z466c" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.385944 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-8f6c7759b-l576j"] Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.386750 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.400171 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc9vc\" (UniqueName: \"kubernetes.io/projected/0ff074f0-de88-4680-a784-82e407cb6a11-kube-api-access-vc9vc\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.421170 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.465518 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8f6c7759b-l576j"] Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.472821 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-trusted-ca-bundle\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.472865 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-service-ca\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " 
pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.472910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8e2c6c3-df6b-4255-a298-2244247c47fc-console-serving-cert\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.472927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvdk9\" (UniqueName: \"kubernetes.io/projected/f8e2c6c3-df6b-4255-a298-2244247c47fc-kube-api-access-tvdk9\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.472943 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8e2c6c3-df6b-4255-a298-2244247c47fc-console-oauth-config\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.472974 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-console-config\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.473010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-oauth-serving-cert\") pod 
\"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.574477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-trusted-ca-bundle\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.574533 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-service-ca\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.574598 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8e2c6c3-df6b-4255-a298-2244247c47fc-console-serving-cert\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.574627 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvdk9\" (UniqueName: \"kubernetes.io/projected/f8e2c6c3-df6b-4255-a298-2244247c47fc-kube-api-access-tvdk9\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.574656 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8e2c6c3-df6b-4255-a298-2244247c47fc-console-oauth-config\") pod \"console-8f6c7759b-l576j\" (UID: 
\"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.574703 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-console-config\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.574729 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-oauth-serving-cert\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.575726 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-console-config\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.575749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-trusted-ca-bundle\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.576258 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-oauth-serving-cert\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " 
pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.576312 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8e2c6c3-df6b-4255-a298-2244247c47fc-service-ca\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.577832 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8e2c6c3-df6b-4255-a298-2244247c47fc-console-oauth-config\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.581710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8e2c6c3-df6b-4255-a298-2244247c47fc-console-serving-cert\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.589649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvdk9\" (UniqueName: \"kubernetes.io/projected/f8e2c6c3-df6b-4255-a298-2244247c47fc-kube-api-access-tvdk9\") pod \"console-8f6c7759b-l576j\" (UID: \"f8e2c6c3-df6b-4255-a298-2244247c47fc\") " pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.700907 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.776294 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9f6d719-a63b-4b45-a13c-64480e0dcc69-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m2mxb\" (UID: \"f9f6d719-a63b-4b45-a13c-64480e0dcc69\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.779605 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9f6d719-a63b-4b45-a13c-64480e0dcc69-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m2mxb\" (UID: \"f9f6d719-a63b-4b45-a13c-64480e0dcc69\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.783147 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ctqt6" event={"ID":"c6fd6496-7123-4ff3-adea-c38716b6a50a","Type":"ContainerStarted","Data":"0ffa34b261883990083f4c2ae5849828fe70cee89f582d176b7d1e608bfde5ca"} Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.855783 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z466c"] Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.876919 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff074f0-de88-4680-a784-82e407cb6a11-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.882445 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0ff074f0-de88-4680-a784-82e407cb6a11-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2rxdq\" (UID: \"0ff074f0-de88-4680-a784-82e407cb6a11\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.959881 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8f6c7759b-l576j"] Mar 19 15:30:29 crc kubenswrapper[4771]: W0319 15:30:29.965974 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e2c6c3_df6b_4255_a298_2244247c47fc.slice/crio-b377924a6382054364b326f1f98445690fbaedb97ce0f780dee1f713a174383c WatchSource:0}: Error finding container b377924a6382054364b326f1f98445690fbaedb97ce0f780dee1f713a174383c: Status 404 returned error can't find the container with id b377924a6382054364b326f1f98445690fbaedb97ce0f780dee1f713a174383c Mar 19 15:30:29 crc kubenswrapper[4771]: I0319 15:30:29.995094 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:30 crc kubenswrapper[4771]: I0319 15:30:30.127668 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" Mar 19 15:30:30 crc kubenswrapper[4771]: I0319 15:30:30.228854 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb"] Mar 19 15:30:30 crc kubenswrapper[4771]: W0319 15:30:30.246798 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f6d719_a63b_4b45_a13c_64480e0dcc69.slice/crio-0ee0363cae973855ea2e47453ea5b59afd31e0c8564d32ddce053b1522b01ee1 WatchSource:0}: Error finding container 0ee0363cae973855ea2e47453ea5b59afd31e0c8564d32ddce053b1522b01ee1: Status 404 returned error can't find the container with id 0ee0363cae973855ea2e47453ea5b59afd31e0c8564d32ddce053b1522b01ee1 Mar 19 15:30:30 crc kubenswrapper[4771]: I0319 15:30:30.341064 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq"] Mar 19 15:30:30 crc kubenswrapper[4771]: W0319 15:30:30.345515 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ff074f0_de88_4680_a784_82e407cb6a11.slice/crio-bccfabfbc3bbdc09d8669c0201dff237b0e1da11e9d19974df26dc13445a7a63 WatchSource:0}: Error finding container bccfabfbc3bbdc09d8669c0201dff237b0e1da11e9d19974df26dc13445a7a63: Status 404 returned error can't find the container with id bccfabfbc3bbdc09d8669c0201dff237b0e1da11e9d19974df26dc13445a7a63 Mar 19 15:30:30 crc kubenswrapper[4771]: I0319 15:30:30.793364 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" event={"ID":"0ff074f0-de88-4680-a784-82e407cb6a11","Type":"ContainerStarted","Data":"bccfabfbc3bbdc09d8669c0201dff237b0e1da11e9d19974df26dc13445a7a63"} Mar 19 15:30:30 crc kubenswrapper[4771]: I0319 15:30:30.795319 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" event={"ID":"f9f6d719-a63b-4b45-a13c-64480e0dcc69","Type":"ContainerStarted","Data":"0ee0363cae973855ea2e47453ea5b59afd31e0c8564d32ddce053b1522b01ee1"} Mar 19 15:30:30 crc kubenswrapper[4771]: I0319 15:30:30.797211 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z466c" event={"ID":"b72761bc-3ae8-4464-9544-d0ed1781f1e5","Type":"ContainerStarted","Data":"1e061394576f80d5ee0df85f824ff77b443b6ff06f7d5f9972b4c948782ea8d8"} Mar 19 15:30:30 crc kubenswrapper[4771]: I0319 15:30:30.799806 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8f6c7759b-l576j" event={"ID":"f8e2c6c3-df6b-4255-a298-2244247c47fc","Type":"ContainerStarted","Data":"04f564d7d99ba6ae9287bc39a327c24aceda0b2fd87aa572db57ffa193561b0e"} Mar 19 15:30:30 crc kubenswrapper[4771]: I0319 15:30:30.799852 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8f6c7759b-l576j" event={"ID":"f8e2c6c3-df6b-4255-a298-2244247c47fc","Type":"ContainerStarted","Data":"b377924a6382054364b326f1f98445690fbaedb97ce0f780dee1f713a174383c"} Mar 19 15:30:30 crc kubenswrapper[4771]: I0319 15:30:30.840456 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8f6c7759b-l576j" podStartSLOduration=1.8403690959999999 podStartE2EDuration="1.840369096s" podCreationTimestamp="2026-03-19 15:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:30:30.832662902 +0000 UTC m=+890.061284164" watchObservedRunningTime="2026-03-19 15:30:30.840369096 +0000 UTC m=+890.068990328" Mar 19 15:30:32 crc kubenswrapper[4771]: I0319 15:30:32.817314 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z466c" 
event={"ID":"b72761bc-3ae8-4464-9544-d0ed1781f1e5","Type":"ContainerStarted","Data":"365338f1d2565bfa57eac0364cdc5eed47080aef5b61f90ea35d157a5c35c66d"} Mar 19 15:30:32 crc kubenswrapper[4771]: I0319 15:30:32.819220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" event={"ID":"f9f6d719-a63b-4b45-a13c-64480e0dcc69","Type":"ContainerStarted","Data":"3acdd8f057c821331cb8796bea7c078be23613d81df2e2d0a4efc214710e82a7"} Mar 19 15:30:32 crc kubenswrapper[4771]: I0319 15:30:32.820695 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:30:32 crc kubenswrapper[4771]: I0319 15:30:32.823920 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ctqt6" event={"ID":"c6fd6496-7123-4ff3-adea-c38716b6a50a","Type":"ContainerStarted","Data":"da4e4172b7b9c68bdfbfaa49bacd79d7eb6f12896b54c7b15efb2f797b03a127"} Mar 19 15:30:32 crc kubenswrapper[4771]: I0319 15:30:32.824269 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:32 crc kubenswrapper[4771]: I0319 15:30:32.851081 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" podStartSLOduration=1.760846393 podStartE2EDuration="3.850983389s" podCreationTimestamp="2026-03-19 15:30:29 +0000 UTC" firstStartedPulling="2026-03-19 15:30:30.249868466 +0000 UTC m=+889.478489668" lastFinishedPulling="2026-03-19 15:30:32.340005462 +0000 UTC m=+891.568626664" observedRunningTime="2026-03-19 15:30:32.839531492 +0000 UTC m=+892.068152764" watchObservedRunningTime="2026-03-19 15:30:32.850983389 +0000 UTC m=+892.079604631" Mar 19 15:30:32 crc kubenswrapper[4771]: I0319 15:30:32.871178 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-ctqt6" 
podStartSLOduration=0.957930888 podStartE2EDuration="3.871141803s" podCreationTimestamp="2026-03-19 15:30:29 +0000 UTC" firstStartedPulling="2026-03-19 15:30:29.446232764 +0000 UTC m=+888.674853966" lastFinishedPulling="2026-03-19 15:30:32.359443679 +0000 UTC m=+891.588064881" observedRunningTime="2026-03-19 15:30:32.862750263 +0000 UTC m=+892.091371565" watchObservedRunningTime="2026-03-19 15:30:32.871141803 +0000 UTC m=+892.099763025" Mar 19 15:30:32 crc kubenswrapper[4771]: I0319 15:30:32.915414 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-crmtc" Mar 19 15:30:32 crc kubenswrapper[4771]: I0319 15:30:32.951601 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-crmtc" Mar 19 15:30:33 crc kubenswrapper[4771]: I0319 15:30:33.146912 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-crmtc"] Mar 19 15:30:34 crc kubenswrapper[4771]: I0319 15:30:34.845464 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" event={"ID":"0ff074f0-de88-4680-a784-82e407cb6a11","Type":"ContainerStarted","Data":"519cd3bb315c2b277ced2f0adb30860d2e1d1a498e854cf34349b3ab8190e4e9"} Mar 19 15:30:34 crc kubenswrapper[4771]: I0319 15:30:34.845890 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-crmtc" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerName="registry-server" containerID="cri-o://c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f" gracePeriod=2 Mar 19 15:30:34 crc kubenswrapper[4771]: I0319 15:30:34.879976 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2rxdq" podStartSLOduration=2.353854088 podStartE2EDuration="5.879954783s" podCreationTimestamp="2026-03-19 15:30:29 
+0000 UTC" firstStartedPulling="2026-03-19 15:30:30.347242817 +0000 UTC m=+889.575864019" lastFinishedPulling="2026-03-19 15:30:33.873343502 +0000 UTC m=+893.101964714" observedRunningTime="2026-03-19 15:30:34.877524522 +0000 UTC m=+894.106145764" watchObservedRunningTime="2026-03-19 15:30:34.879954783 +0000 UTC m=+894.108575975" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.227236 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crmtc" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.369732 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-catalog-content\") pod \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.369835 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76fg4\" (UniqueName: \"kubernetes.io/projected/37214cc0-ce69-44e2-80a0-ea1442abb9fa-kube-api-access-76fg4\") pod \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.369861 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-utilities\") pod \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\" (UID: \"37214cc0-ce69-44e2-80a0-ea1442abb9fa\") " Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.370741 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-utilities" (OuterVolumeSpecName: "utilities") pod "37214cc0-ce69-44e2-80a0-ea1442abb9fa" (UID: "37214cc0-ce69-44e2-80a0-ea1442abb9fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.374947 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37214cc0-ce69-44e2-80a0-ea1442abb9fa-kube-api-access-76fg4" (OuterVolumeSpecName: "kube-api-access-76fg4") pod "37214cc0-ce69-44e2-80a0-ea1442abb9fa" (UID: "37214cc0-ce69-44e2-80a0-ea1442abb9fa"). InnerVolumeSpecName "kube-api-access-76fg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.472048 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76fg4\" (UniqueName: \"kubernetes.io/projected/37214cc0-ce69-44e2-80a0-ea1442abb9fa-kube-api-access-76fg4\") on node \"crc\" DevicePath \"\"" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.472211 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.496534 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37214cc0-ce69-44e2-80a0-ea1442abb9fa" (UID: "37214cc0-ce69-44e2-80a0-ea1442abb9fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.573830 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37214cc0-ce69-44e2-80a0-ea1442abb9fa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.853536 4771 generic.go:334] "Generic (PLEG): container finished" podID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerID="c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f" exitCode=0 Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.853652 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crmtc" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.853612 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crmtc" event={"ID":"37214cc0-ce69-44e2-80a0-ea1442abb9fa","Type":"ContainerDied","Data":"c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f"} Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.853733 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crmtc" event={"ID":"37214cc0-ce69-44e2-80a0-ea1442abb9fa","Type":"ContainerDied","Data":"a2e8d5631ae933c188201b381d535cacf06f8079c676dee7a6d8d0b80ea10463"} Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.853768 4771 scope.go:117] "RemoveContainer" containerID="c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.874975 4771 scope.go:117] "RemoveContainer" containerID="0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.881249 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-crmtc"] Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 
15:30:35.893227 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-crmtc"] Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.905760 4771 scope.go:117] "RemoveContainer" containerID="81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.929834 4771 scope.go:117] "RemoveContainer" containerID="c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f" Mar 19 15:30:35 crc kubenswrapper[4771]: E0319 15:30:35.930244 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f\": container with ID starting with c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f not found: ID does not exist" containerID="c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.930296 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f"} err="failed to get container status \"c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f\": rpc error: code = NotFound desc = could not find container \"c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f\": container with ID starting with c49c670822eaf421db7e93cd26ddcce9e6753f823c980a200e8a96608a69132f not found: ID does not exist" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.930328 4771 scope.go:117] "RemoveContainer" containerID="0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20" Mar 19 15:30:35 crc kubenswrapper[4771]: E0319 15:30:35.930867 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20\": container with ID 
starting with 0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20 not found: ID does not exist" containerID="0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.930902 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20"} err="failed to get container status \"0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20\": rpc error: code = NotFound desc = could not find container \"0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20\": container with ID starting with 0aa4cad78d603a5adceaec4501e00c1203c108a67ac78005a7402e000bb89c20 not found: ID does not exist" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.930922 4771 scope.go:117] "RemoveContainer" containerID="81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641" Mar 19 15:30:35 crc kubenswrapper[4771]: E0319 15:30:35.931235 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641\": container with ID starting with 81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641 not found: ID does not exist" containerID="81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641" Mar 19 15:30:35 crc kubenswrapper[4771]: I0319 15:30:35.931287 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641"} err="failed to get container status \"81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641\": rpc error: code = NotFound desc = could not find container \"81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641\": container with ID starting with 81d7224120ac277ee4dbf2910590850af9a54b5449f3444020c221a185e42641 not found: 
ID does not exist" Mar 19 15:30:36 crc kubenswrapper[4771]: I0319 15:30:36.873758 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z466c" event={"ID":"b72761bc-3ae8-4464-9544-d0ed1781f1e5","Type":"ContainerStarted","Data":"eb48411173bf419da517b932105de18d0f26343de44dd1927b50878f6a2657bb"} Mar 19 15:30:36 crc kubenswrapper[4771]: I0319 15:30:36.902827 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z466c" podStartSLOduration=1.909481848 podStartE2EDuration="7.902806533s" podCreationTimestamp="2026-03-19 15:30:29 +0000 UTC" firstStartedPulling="2026-03-19 15:30:29.865369768 +0000 UTC m=+889.093990970" lastFinishedPulling="2026-03-19 15:30:35.858694413 +0000 UTC m=+895.087315655" observedRunningTime="2026-03-19 15:30:36.899552411 +0000 UTC m=+896.128173643" watchObservedRunningTime="2026-03-19 15:30:36.902806533 +0000 UTC m=+896.131427735" Mar 19 15:30:37 crc kubenswrapper[4771]: I0319 15:30:37.522372 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" path="/var/lib/kubelet/pods/37214cc0-ce69-44e2-80a0-ea1442abb9fa/volumes" Mar 19 15:30:39 crc kubenswrapper[4771]: I0319 15:30:39.460686 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-ctqt6" Mar 19 15:30:39 crc kubenswrapper[4771]: I0319 15:30:39.701692 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:39 crc kubenswrapper[4771]: I0319 15:30:39.701776 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:39 crc kubenswrapper[4771]: I0319 15:30:39.710314 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:39 crc 
kubenswrapper[4771]: I0319 15:30:39.904021 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8f6c7759b-l576j" Mar 19 15:30:39 crc kubenswrapper[4771]: I0319 15:30:39.994410 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-h97xq"] Mar 19 15:30:42 crc kubenswrapper[4771]: I0319 15:30:42.171388 4771 scope.go:117] "RemoveContainer" containerID="e2905b910893d2344f486917c4c4388c01a5395264f73c1be0ceca8922ed821c" Mar 19 15:30:50 crc kubenswrapper[4771]: I0319 15:30:50.041323 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m2mxb" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.681441 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp"] Mar 19 15:31:03 crc kubenswrapper[4771]: E0319 15:31:03.682645 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerName="extract-content" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.682661 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerName="extract-content" Mar 19 15:31:03 crc kubenswrapper[4771]: E0319 15:31:03.682673 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerName="registry-server" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.682680 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerName="registry-server" Mar 19 15:31:03 crc kubenswrapper[4771]: E0319 15:31:03.682696 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerName="extract-utilities" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.682704 4771 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerName="extract-utilities" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.682827 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="37214cc0-ce69-44e2-80a0-ea1442abb9fa" containerName="registry-server" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.683817 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.686131 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.693326 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp"] Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.829392 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.829448 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.829509 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62lqr\" (UniqueName: \"kubernetes.io/projected/dda7fbab-2dc2-4bb5-9106-2424eec739d8-kube-api-access-62lqr\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.931074 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62lqr\" (UniqueName: \"kubernetes.io/projected/dda7fbab-2dc2-4bb5-9106-2424eec739d8-kube-api-access-62lqr\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.931260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.931349 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.932160 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.932348 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:03 crc kubenswrapper[4771]: I0319 15:31:03.957501 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62lqr\" (UniqueName: \"kubernetes.io/projected/dda7fbab-2dc2-4bb5-9106-2424eec739d8-kube-api-access-62lqr\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:04 crc kubenswrapper[4771]: I0319 15:31:04.011096 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:04 crc kubenswrapper[4771]: I0319 15:31:04.264825 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp"] Mar 19 15:31:04 crc kubenswrapper[4771]: I0319 15:31:04.422021 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" event={"ID":"dda7fbab-2dc2-4bb5-9106-2424eec739d8","Type":"ContainerStarted","Data":"a24ac28dd75fdba7904a63b99ab23687f51b26d67feda3f2a0a38096a47a33de"} Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.051171 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-h97xq" podUID="6dc754c0-8f17-402b-9bd4-be033eb940ba" containerName="console" containerID="cri-o://5ce486487bcb8cdb2522de50d8410760e3f65be7ffe691bb5c05dfbba5142e72" gracePeriod=15 Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.431850 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-h97xq_6dc754c0-8f17-402b-9bd4-be033eb940ba/console/0.log" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.431898 4771 generic.go:334] "Generic (PLEG): container finished" podID="6dc754c0-8f17-402b-9bd4-be033eb940ba" containerID="5ce486487bcb8cdb2522de50d8410760e3f65be7ffe691bb5c05dfbba5142e72" exitCode=2 Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.431958 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h97xq" event={"ID":"6dc754c0-8f17-402b-9bd4-be033eb940ba","Type":"ContainerDied","Data":"5ce486487bcb8cdb2522de50d8410760e3f65be7ffe691bb5c05dfbba5142e72"} Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.434710 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" containerID="e02e9584c290c8cbf193c1bb54f157338634fcb6bb434dc3006770c7a3079a44" exitCode=0 Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.434769 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" event={"ID":"dda7fbab-2dc2-4bb5-9106-2424eec739d8","Type":"ContainerDied","Data":"e02e9584c290c8cbf193c1bb54f157338634fcb6bb434dc3006770c7a3079a44"} Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.437474 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.530799 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-h97xq_6dc754c0-8f17-402b-9bd4-be033eb940ba/console/0.log" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.530906 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.657713 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-oauth-serving-cert\") pod \"6dc754c0-8f17-402b-9bd4-be033eb940ba\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.657834 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-serving-cert\") pod \"6dc754c0-8f17-402b-9bd4-be033eb940ba\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.657868 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-trusted-ca-bundle\") pod \"6dc754c0-8f17-402b-9bd4-be033eb940ba\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.657903 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-oauth-config\") pod \"6dc754c0-8f17-402b-9bd4-be033eb940ba\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.657951 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-config\") pod \"6dc754c0-8f17-402b-9bd4-be033eb940ba\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.658059 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4xcw\" (UniqueName: \"kubernetes.io/projected/6dc754c0-8f17-402b-9bd4-be033eb940ba-kube-api-access-c4xcw\") pod \"6dc754c0-8f17-402b-9bd4-be033eb940ba\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.658097 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-service-ca\") pod \"6dc754c0-8f17-402b-9bd4-be033eb940ba\" (UID: \"6dc754c0-8f17-402b-9bd4-be033eb940ba\") " Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.658898 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-config" (OuterVolumeSpecName: "console-config") pod "6dc754c0-8f17-402b-9bd4-be033eb940ba" (UID: "6dc754c0-8f17-402b-9bd4-be033eb940ba"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.658902 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6dc754c0-8f17-402b-9bd4-be033eb940ba" (UID: "6dc754c0-8f17-402b-9bd4-be033eb940ba"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.658951 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-service-ca" (OuterVolumeSpecName: "service-ca") pod "6dc754c0-8f17-402b-9bd4-be033eb940ba" (UID: "6dc754c0-8f17-402b-9bd4-be033eb940ba"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.659015 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6dc754c0-8f17-402b-9bd4-be033eb940ba" (UID: "6dc754c0-8f17-402b-9bd4-be033eb940ba"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.665606 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6dc754c0-8f17-402b-9bd4-be033eb940ba" (UID: "6dc754c0-8f17-402b-9bd4-be033eb940ba"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.669597 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc754c0-8f17-402b-9bd4-be033eb940ba-kube-api-access-c4xcw" (OuterVolumeSpecName: "kube-api-access-c4xcw") pod "6dc754c0-8f17-402b-9bd4-be033eb940ba" (UID: "6dc754c0-8f17-402b-9bd4-be033eb940ba"). InnerVolumeSpecName "kube-api-access-c4xcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.672375 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6dc754c0-8f17-402b-9bd4-be033eb940ba" (UID: "6dc754c0-8f17-402b-9bd4-be033eb940ba"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.759644 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4xcw\" (UniqueName: \"kubernetes.io/projected/6dc754c0-8f17-402b-9bd4-be033eb940ba-kube-api-access-c4xcw\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.759701 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.759723 4771 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.759740 4771 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.759757 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.759774 4771 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:05 crc kubenswrapper[4771]: I0319 15:31:05.759790 4771 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6dc754c0-8f17-402b-9bd4-be033eb940ba-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:06 crc kubenswrapper[4771]: I0319 15:31:06.446938 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-h97xq_6dc754c0-8f17-402b-9bd4-be033eb940ba/console/0.log" Mar 19 15:31:06 crc kubenswrapper[4771]: I0319 15:31:06.447060 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h97xq" event={"ID":"6dc754c0-8f17-402b-9bd4-be033eb940ba","Type":"ContainerDied","Data":"897383fea6d9e7358093ca896f88dc9dcba5008acfd0c691f40579af3bb59057"} Mar 19 15:31:06 crc kubenswrapper[4771]: I0319 15:31:06.447126 4771 scope.go:117] "RemoveContainer" containerID="5ce486487bcb8cdb2522de50d8410760e3f65be7ffe691bb5c05dfbba5142e72" Mar 19 15:31:06 crc kubenswrapper[4771]: I0319 15:31:06.447176 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-h97xq" Mar 19 15:31:06 crc kubenswrapper[4771]: I0319 15:31:06.495435 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-h97xq"] Mar 19 15:31:06 crc kubenswrapper[4771]: I0319 15:31:06.500819 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-h97xq"] Mar 19 15:31:07 crc kubenswrapper[4771]: I0319 15:31:07.518780 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc754c0-8f17-402b-9bd4-be033eb940ba" path="/var/lib/kubelet/pods/6dc754c0-8f17-402b-9bd4-be033eb940ba/volumes" Mar 19 15:31:08 crc kubenswrapper[4771]: I0319 15:31:08.470841 4771 generic.go:334] "Generic (PLEG): container finished" podID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" containerID="de74d94f228effc90e588db3bd260beb7abe72cdd0073870e2ed7376ee28157e" exitCode=0 Mar 19 15:31:08 crc kubenswrapper[4771]: I0319 15:31:08.470875 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" event={"ID":"dda7fbab-2dc2-4bb5-9106-2424eec739d8","Type":"ContainerDied","Data":"de74d94f228effc90e588db3bd260beb7abe72cdd0073870e2ed7376ee28157e"} Mar 19 15:31:09 crc kubenswrapper[4771]: I0319 15:31:09.483239 4771 generic.go:334] "Generic (PLEG): container finished" podID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" containerID="c55c56c441e2926e5a30ce8f9546d16c2cc31e92266554936e4364d606f22d51" exitCode=0 Mar 19 15:31:09 crc kubenswrapper[4771]: I0319 15:31:09.483307 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" event={"ID":"dda7fbab-2dc2-4bb5-9106-2424eec739d8","Type":"ContainerDied","Data":"c55c56c441e2926e5a30ce8f9546d16c2cc31e92266554936e4364d606f22d51"} Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.744295 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.844137 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62lqr\" (UniqueName: \"kubernetes.io/projected/dda7fbab-2dc2-4bb5-9106-2424eec739d8-kube-api-access-62lqr\") pod \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.844211 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-bundle\") pod \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.844243 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-util\") pod \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\" (UID: \"dda7fbab-2dc2-4bb5-9106-2424eec739d8\") " Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.846341 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-bundle" (OuterVolumeSpecName: "bundle") pod "dda7fbab-2dc2-4bb5-9106-2424eec739d8" (UID: "dda7fbab-2dc2-4bb5-9106-2424eec739d8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.854182 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda7fbab-2dc2-4bb5-9106-2424eec739d8-kube-api-access-62lqr" (OuterVolumeSpecName: "kube-api-access-62lqr") pod "dda7fbab-2dc2-4bb5-9106-2424eec739d8" (UID: "dda7fbab-2dc2-4bb5-9106-2424eec739d8"). InnerVolumeSpecName "kube-api-access-62lqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.855694 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-util" (OuterVolumeSpecName: "util") pod "dda7fbab-2dc2-4bb5-9106-2424eec739d8" (UID: "dda7fbab-2dc2-4bb5-9106-2424eec739d8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.945590 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62lqr\" (UniqueName: \"kubernetes.io/projected/dda7fbab-2dc2-4bb5-9106-2424eec739d8-kube-api-access-62lqr\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.945639 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:10 crc kubenswrapper[4771]: I0319 15:31:10.945665 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda7fbab-2dc2-4bb5-9106-2424eec739d8-util\") on node \"crc\" DevicePath \"\"" Mar 19 15:31:11 crc kubenswrapper[4771]: I0319 15:31:11.501572 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" event={"ID":"dda7fbab-2dc2-4bb5-9106-2424eec739d8","Type":"ContainerDied","Data":"a24ac28dd75fdba7904a63b99ab23687f51b26d67feda3f2a0a38096a47a33de"} Mar 19 15:31:11 crc kubenswrapper[4771]: I0319 15:31:11.501616 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24ac28dd75fdba7904a63b99ab23687f51b26d67feda3f2a0a38096a47a33de" Mar 19 15:31:11 crc kubenswrapper[4771]: I0319 15:31:11.501639 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.002933 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc"] Mar 19 15:31:19 crc kubenswrapper[4771]: E0319 15:31:19.003655 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" containerName="extract" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.003667 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" containerName="extract" Mar 19 15:31:19 crc kubenswrapper[4771]: E0319 15:31:19.003680 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" containerName="pull" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.003686 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" containerName="pull" Mar 19 15:31:19 crc kubenswrapper[4771]: E0319 15:31:19.003699 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" containerName="util" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.003706 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" containerName="util" Mar 19 15:31:19 crc kubenswrapper[4771]: E0319 15:31:19.003719 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc754c0-8f17-402b-9bd4-be033eb940ba" containerName="console" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.003726 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc754c0-8f17-402b-9bd4-be033eb940ba" containerName="console" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.003828 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda7fbab-2dc2-4bb5-9106-2424eec739d8" 
containerName="extract" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.003848 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc754c0-8f17-402b-9bd4-be033eb940ba" containerName="console" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.004267 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.007436 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.007484 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-qcltn" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.007558 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.007436 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.007448 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.029213 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc"] Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.154041 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grqmw\" (UniqueName: \"kubernetes.io/projected/6a6a1f64-4048-41cf-a2c6-19eb960fa8ae-kube-api-access-grqmw\") pod \"metallb-operator-controller-manager-cc6bc498-fn6sc\" (UID: \"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae\") " 
pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.154110 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a6a1f64-4048-41cf-a2c6-19eb960fa8ae-apiservice-cert\") pod \"metallb-operator-controller-manager-cc6bc498-fn6sc\" (UID: \"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae\") " pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.154132 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a6a1f64-4048-41cf-a2c6-19eb960fa8ae-webhook-cert\") pod \"metallb-operator-controller-manager-cc6bc498-fn6sc\" (UID: \"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae\") " pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.255402 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a6a1f64-4048-41cf-a2c6-19eb960fa8ae-apiservice-cert\") pod \"metallb-operator-controller-manager-cc6bc498-fn6sc\" (UID: \"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae\") " pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.255444 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a6a1f64-4048-41cf-a2c6-19eb960fa8ae-webhook-cert\") pod \"metallb-operator-controller-manager-cc6bc498-fn6sc\" (UID: \"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae\") " pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.255508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-grqmw\" (UniqueName: \"kubernetes.io/projected/6a6a1f64-4048-41cf-a2c6-19eb960fa8ae-kube-api-access-grqmw\") pod \"metallb-operator-controller-manager-cc6bc498-fn6sc\" (UID: \"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae\") " pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.272347 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a6a1f64-4048-41cf-a2c6-19eb960fa8ae-apiservice-cert\") pod \"metallb-operator-controller-manager-cc6bc498-fn6sc\" (UID: \"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae\") " pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.276377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grqmw\" (UniqueName: \"kubernetes.io/projected/6a6a1f64-4048-41cf-a2c6-19eb960fa8ae-kube-api-access-grqmw\") pod \"metallb-operator-controller-manager-cc6bc498-fn6sc\" (UID: \"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae\") " pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.277804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a6a1f64-4048-41cf-a2c6-19eb960fa8ae-webhook-cert\") pod \"metallb-operator-controller-manager-cc6bc498-fn6sc\" (UID: \"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae\") " pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.321009 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.390690 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb"] Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.391702 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.393447 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5qfzt" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.393631 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.393653 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.479382 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb"] Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.560014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fxn\" (UniqueName: \"kubernetes.io/projected/789091c5-c870-4a79-ab5c-8e42cf14c768-kube-api-access-b5fxn\") pod \"metallb-operator-webhook-server-7597b4dfd5-tz6lb\" (UID: \"789091c5-c870-4a79-ab5c-8e42cf14c768\") " pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.560100 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/789091c5-c870-4a79-ab5c-8e42cf14c768-apiservice-cert\") pod 
\"metallb-operator-webhook-server-7597b4dfd5-tz6lb\" (UID: \"789091c5-c870-4a79-ab5c-8e42cf14c768\") " pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.560151 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/789091c5-c870-4a79-ab5c-8e42cf14c768-webhook-cert\") pod \"metallb-operator-webhook-server-7597b4dfd5-tz6lb\" (UID: \"789091c5-c870-4a79-ab5c-8e42cf14c768\") " pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.576185 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc"] Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.661565 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/789091c5-c870-4a79-ab5c-8e42cf14c768-webhook-cert\") pod \"metallb-operator-webhook-server-7597b4dfd5-tz6lb\" (UID: \"789091c5-c870-4a79-ab5c-8e42cf14c768\") " pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.661660 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fxn\" (UniqueName: \"kubernetes.io/projected/789091c5-c870-4a79-ab5c-8e42cf14c768-kube-api-access-b5fxn\") pod \"metallb-operator-webhook-server-7597b4dfd5-tz6lb\" (UID: \"789091c5-c870-4a79-ab5c-8e42cf14c768\") " pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.661710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/789091c5-c870-4a79-ab5c-8e42cf14c768-apiservice-cert\") pod \"metallb-operator-webhook-server-7597b4dfd5-tz6lb\" 
(UID: \"789091c5-c870-4a79-ab5c-8e42cf14c768\") " pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.666891 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/789091c5-c870-4a79-ab5c-8e42cf14c768-apiservice-cert\") pod \"metallb-operator-webhook-server-7597b4dfd5-tz6lb\" (UID: \"789091c5-c870-4a79-ab5c-8e42cf14c768\") " pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.666906 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/789091c5-c870-4a79-ab5c-8e42cf14c768-webhook-cert\") pod \"metallb-operator-webhook-server-7597b4dfd5-tz6lb\" (UID: \"789091c5-c870-4a79-ab5c-8e42cf14c768\") " pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.679066 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fxn\" (UniqueName: \"kubernetes.io/projected/789091c5-c870-4a79-ab5c-8e42cf14c768-kube-api-access-b5fxn\") pod \"metallb-operator-webhook-server-7597b4dfd5-tz6lb\" (UID: \"789091c5-c870-4a79-ab5c-8e42cf14c768\") " pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.707596 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:19 crc kubenswrapper[4771]: I0319 15:31:19.947350 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb"] Mar 19 15:31:19 crc kubenswrapper[4771]: W0319 15:31:19.959636 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod789091c5_c870_4a79_ab5c_8e42cf14c768.slice/crio-042a02446eea9e6fd62b35e4f8d8a91153d83f70ae4a1dd2a59334e4346fadfb WatchSource:0}: Error finding container 042a02446eea9e6fd62b35e4f8d8a91153d83f70ae4a1dd2a59334e4346fadfb: Status 404 returned error can't find the container with id 042a02446eea9e6fd62b35e4f8d8a91153d83f70ae4a1dd2a59334e4346fadfb Mar 19 15:31:20 crc kubenswrapper[4771]: I0319 15:31:20.556696 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" event={"ID":"789091c5-c870-4a79-ab5c-8e42cf14c768","Type":"ContainerStarted","Data":"042a02446eea9e6fd62b35e4f8d8a91153d83f70ae4a1dd2a59334e4346fadfb"} Mar 19 15:31:20 crc kubenswrapper[4771]: I0319 15:31:20.558381 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" event={"ID":"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae","Type":"ContainerStarted","Data":"7e81456d3d0c86336a5d7d6a2741ab6ae93b16020794b36f08249aaaebe3a1f6"} Mar 19 15:31:25 crc kubenswrapper[4771]: I0319 15:31:25.600119 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" event={"ID":"789091c5-c870-4a79-ab5c-8e42cf14c768","Type":"ContainerStarted","Data":"9d4e49cab766c73e73d855c35039dbf403cb99d46c97b0741f94c31873fec0b4"} Mar 19 15:31:25 crc kubenswrapper[4771]: I0319 15:31:25.600780 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:25 crc kubenswrapper[4771]: I0319 15:31:25.602094 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" event={"ID":"6a6a1f64-4048-41cf-a2c6-19eb960fa8ae","Type":"ContainerStarted","Data":"3b5fd9b6c24407975f56ef559c43f00740d77b04df8ca43708cf17d7f5425b2d"} Mar 19 15:31:25 crc kubenswrapper[4771]: I0319 15:31:25.602655 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:25 crc kubenswrapper[4771]: I0319 15:31:25.626282 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" podStartSLOduration=1.170987712 podStartE2EDuration="6.6262653s" podCreationTimestamp="2026-03-19 15:31:19 +0000 UTC" firstStartedPulling="2026-03-19 15:31:19.963161143 +0000 UTC m=+939.191782345" lastFinishedPulling="2026-03-19 15:31:25.418438731 +0000 UTC m=+944.647059933" observedRunningTime="2026-03-19 15:31:25.619845139 +0000 UTC m=+944.848466351" watchObservedRunningTime="2026-03-19 15:31:25.6262653 +0000 UTC m=+944.854886512" Mar 19 15:31:25 crc kubenswrapper[4771]: I0319 15:31:25.649622 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" podStartSLOduration=1.845397611 podStartE2EDuration="7.649601964s" podCreationTimestamp="2026-03-19 15:31:18 +0000 UTC" firstStartedPulling="2026-03-19 15:31:19.594614466 +0000 UTC m=+938.823235668" lastFinishedPulling="2026-03-19 15:31:25.398818819 +0000 UTC m=+944.627440021" observedRunningTime="2026-03-19 15:31:25.646940247 +0000 UTC m=+944.875561459" watchObservedRunningTime="2026-03-19 15:31:25.649601964 +0000 UTC m=+944.878223166" Mar 19 15:31:39 crc kubenswrapper[4771]: I0319 15:31:39.711483 4771 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7597b4dfd5-tz6lb" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.413115 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rjgb8"] Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.416495 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.442928 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjgb8"] Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.477916 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-catalog-content\") pod \"redhat-marketplace-rjgb8\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.478424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c5jb\" (UniqueName: \"kubernetes.io/projected/d339b8d5-d6f0-46f3-8505-191ce6754fa0-kube-api-access-9c5jb\") pod \"redhat-marketplace-rjgb8\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.478559 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-utilities\") pod \"redhat-marketplace-rjgb8\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.580220 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-catalog-content\") pod \"redhat-marketplace-rjgb8\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.580317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c5jb\" (UniqueName: \"kubernetes.io/projected/d339b8d5-d6f0-46f3-8505-191ce6754fa0-kube-api-access-9c5jb\") pod \"redhat-marketplace-rjgb8\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.580386 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-utilities\") pod \"redhat-marketplace-rjgb8\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.580844 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-catalog-content\") pod \"redhat-marketplace-rjgb8\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.580928 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-utilities\") pod \"redhat-marketplace-rjgb8\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.602623 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9c5jb\" (UniqueName: \"kubernetes.io/projected/d339b8d5-d6f0-46f3-8505-191ce6754fa0-kube-api-access-9c5jb\") pod \"redhat-marketplace-rjgb8\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.755712 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:49 crc kubenswrapper[4771]: I0319 15:31:49.967832 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjgb8"] Mar 19 15:31:50 crc kubenswrapper[4771]: I0319 15:31:50.769432 4771 generic.go:334] "Generic (PLEG): container finished" podID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerID="250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48" exitCode=0 Mar 19 15:31:50 crc kubenswrapper[4771]: I0319 15:31:50.769514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjgb8" event={"ID":"d339b8d5-d6f0-46f3-8505-191ce6754fa0","Type":"ContainerDied","Data":"250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48"} Mar 19 15:31:50 crc kubenswrapper[4771]: I0319 15:31:50.769792 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjgb8" event={"ID":"d339b8d5-d6f0-46f3-8505-191ce6754fa0","Type":"ContainerStarted","Data":"2d8035110d5e5fa3b475002bcb165085b40fc0a457b8980e3e01578b95c873f6"} Mar 19 15:31:52 crc kubenswrapper[4771]: I0319 15:31:52.793020 4771 generic.go:334] "Generic (PLEG): container finished" podID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerID="f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87" exitCode=0 Mar 19 15:31:52 crc kubenswrapper[4771]: I0319 15:31:52.793092 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjgb8" 
event={"ID":"d339b8d5-d6f0-46f3-8505-191ce6754fa0","Type":"ContainerDied","Data":"f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87"} Mar 19 15:31:53 crc kubenswrapper[4771]: I0319 15:31:53.801845 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjgb8" event={"ID":"d339b8d5-d6f0-46f3-8505-191ce6754fa0","Type":"ContainerStarted","Data":"8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a"} Mar 19 15:31:53 crc kubenswrapper[4771]: I0319 15:31:53.830267 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rjgb8" podStartSLOduration=2.135488925 podStartE2EDuration="4.830248634s" podCreationTimestamp="2026-03-19 15:31:49 +0000 UTC" firstStartedPulling="2026-03-19 15:31:50.772966189 +0000 UTC m=+970.001587441" lastFinishedPulling="2026-03-19 15:31:53.467725948 +0000 UTC m=+972.696347150" observedRunningTime="2026-03-19 15:31:53.82925872 +0000 UTC m=+973.057879922" watchObservedRunningTime="2026-03-19 15:31:53.830248634 +0000 UTC m=+973.058869836" Mar 19 15:31:59 crc kubenswrapper[4771]: I0319 15:31:59.383623 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cc6bc498-fn6sc" Mar 19 15:31:59 crc kubenswrapper[4771]: I0319 15:31:59.757177 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:59 crc kubenswrapper[4771]: I0319 15:31:59.757246 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:59 crc kubenswrapper[4771]: I0319 15:31:59.807716 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:31:59 crc kubenswrapper[4771]: I0319 15:31:59.889008 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.066265 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjgb8"] Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.113807 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-z4zw5"] Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.129788 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc"] Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.130725 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.131364 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.138808 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.139049 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.139305 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.139672 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-x8ztv" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.139831 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc"] Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.198334 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565572-h8tvp"] Mar 19 15:32:00 
crc kubenswrapper[4771]: I0319 15:32:00.199391 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565572-h8tvp" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.206785 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.209436 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565572-h8tvp"] Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.210343 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.210515 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.226927 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vltvb"] Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.228170 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.229086 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-vd9n2"] Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.229814 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.232603 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-44vzv" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.232829 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.232952 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.233137 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.233275 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.234650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-metrics-certs\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.234700 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-frr-conf\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.234725 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-reloader\") pod \"frr-k8s-z4zw5\" (UID: 
\"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.234780 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67qhj\" (UniqueName: \"kubernetes.io/projected/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-kube-api-access-67qhj\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.234807 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-frr-startup\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.234825 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-metrics\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.234850 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj8lk\" (UniqueName: \"kubernetes.io/projected/d13f776e-2828-4557-9abf-1d55eab1cf73-kube-api-access-zj8lk\") pod \"frr-k8s-webhook-server-bcc4b6f68-9mlqc\" (UID: \"d13f776e-2828-4557-9abf-1d55eab1cf73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.234874 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-frr-sockets\") pod \"frr-k8s-z4zw5\" (UID: 
\"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.234903 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d13f776e-2828-4557-9abf-1d55eab1cf73-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-9mlqc\" (UID: \"d13f776e-2828-4557-9abf-1d55eab1cf73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.243741 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-vd9n2"] Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxwv5\" (UniqueName: \"kubernetes.io/projected/c6259267-1d63-453e-aefd-5eb03b54f532-kube-api-access-cxwv5\") pod \"controller-7bb4cc7c98-vd9n2\" (UID: \"c6259267-1d63-453e-aefd-5eb03b54f532\") " pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336165 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6259267-1d63-453e-aefd-5eb03b54f532-metrics-certs\") pod \"controller-7bb4cc7c98-vd9n2\" (UID: \"c6259267-1d63-453e-aefd-5eb03b54f532\") " pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336243 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8d2a4955-e0a0-42e0-86f5-4812f49a2553-metallb-excludel2\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336295 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-metrics-certs\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336335 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-frr-conf\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336367 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-reloader\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336390 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-metrics-certs\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336416 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-memberlist\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336438 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6259267-1d63-453e-aefd-5eb03b54f532-cert\") pod 
\"controller-7bb4cc7c98-vd9n2\" (UID: \"c6259267-1d63-453e-aefd-5eb03b54f532\") " pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336485 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67qhj\" (UniqueName: \"kubernetes.io/projected/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-kube-api-access-67qhj\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336506 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-metrics\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336525 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-frr-startup\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336547 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj8lk\" (UniqueName: \"kubernetes.io/projected/d13f776e-2828-4557-9abf-1d55eab1cf73-kube-api-access-zj8lk\") pod \"frr-k8s-webhook-server-bcc4b6f68-9mlqc\" (UID: \"d13f776e-2828-4557-9abf-1d55eab1cf73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336567 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-frr-sockets\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " 
pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336598 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqtqn\" (UniqueName: \"kubernetes.io/projected/8d2a4955-e0a0-42e0-86f5-4812f49a2553-kube-api-access-tqtqn\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336624 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d13f776e-2828-4557-9abf-1d55eab1cf73-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-9mlqc\" (UID: \"d13f776e-2828-4557-9abf-1d55eab1cf73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.336645 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w79ks\" (UniqueName: \"kubernetes.io/projected/881b13b7-beaa-4807-a523-329fd35bb96d-kube-api-access-w79ks\") pod \"auto-csr-approver-29565572-h8tvp\" (UID: \"881b13b7-beaa-4807-a523-329fd35bb96d\") " pod="openshift-infra/auto-csr-approver-29565572-h8tvp" Mar 19 15:32:00 crc kubenswrapper[4771]: E0319 15:32:00.337166 4771 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 19 15:32:00 crc kubenswrapper[4771]: E0319 15:32:00.337264 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13f776e-2828-4557-9abf-1d55eab1cf73-cert podName:d13f776e-2828-4557-9abf-1d55eab1cf73 nodeName:}" failed. No retries permitted until 2026-03-19 15:32:00.837241412 +0000 UTC m=+980.065862814 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d13f776e-2828-4557-9abf-1d55eab1cf73-cert") pod "frr-k8s-webhook-server-bcc4b6f68-9mlqc" (UID: "d13f776e-2828-4557-9abf-1d55eab1cf73") : secret "frr-k8s-webhook-server-cert" not found Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.337318 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-frr-conf\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.337569 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-frr-sockets\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.337799 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-reloader\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.337911 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-metrics\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.339342 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-frr-startup\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" 
Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.343327 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-metrics-certs\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.353273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67qhj\" (UniqueName: \"kubernetes.io/projected/775dcd57-e7c4-416d-a758-a5bb4ccc74ab-kube-api-access-67qhj\") pod \"frr-k8s-z4zw5\" (UID: \"775dcd57-e7c4-416d-a758-a5bb4ccc74ab\") " pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.359493 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj8lk\" (UniqueName: \"kubernetes.io/projected/d13f776e-2828-4557-9abf-1d55eab1cf73-kube-api-access-zj8lk\") pod \"frr-k8s-webhook-server-bcc4b6f68-9mlqc\" (UID: \"d13f776e-2828-4557-9abf-1d55eab1cf73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.438687 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-metrics-certs\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.439901 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-memberlist\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.440221 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/c6259267-1d63-453e-aefd-5eb03b54f532-cert\") pod \"controller-7bb4cc7c98-vd9n2\" (UID: \"c6259267-1d63-453e-aefd-5eb03b54f532\") " pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.440525 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqtqn\" (UniqueName: \"kubernetes.io/projected/8d2a4955-e0a0-42e0-86f5-4812f49a2553-kube-api-access-tqtqn\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: E0319 15:32:00.440162 4771 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 15:32:00 crc kubenswrapper[4771]: E0319 15:32:00.440758 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-memberlist podName:8d2a4955-e0a0-42e0-86f5-4812f49a2553 nodeName:}" failed. No retries permitted until 2026-03-19 15:32:00.940734163 +0000 UTC m=+980.169355595 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-memberlist") pod "speaker-vltvb" (UID: "8d2a4955-e0a0-42e0-86f5-4812f49a2553") : secret "metallb-memberlist" not found Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.440679 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w79ks\" (UniqueName: \"kubernetes.io/projected/881b13b7-beaa-4807-a523-329fd35bb96d-kube-api-access-w79ks\") pod \"auto-csr-approver-29565572-h8tvp\" (UID: \"881b13b7-beaa-4807-a523-329fd35bb96d\") " pod="openshift-infra/auto-csr-approver-29565572-h8tvp" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.440915 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxwv5\" (UniqueName: \"kubernetes.io/projected/c6259267-1d63-453e-aefd-5eb03b54f532-kube-api-access-cxwv5\") pod \"controller-7bb4cc7c98-vd9n2\" (UID: \"c6259267-1d63-453e-aefd-5eb03b54f532\") " pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.440972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6259267-1d63-453e-aefd-5eb03b54f532-metrics-certs\") pod \"controller-7bb4cc7c98-vd9n2\" (UID: \"c6259267-1d63-453e-aefd-5eb03b54f532\") " pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.441028 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8d2a4955-e0a0-42e0-86f5-4812f49a2553-metallb-excludel2\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.441892 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/8d2a4955-e0a0-42e0-86f5-4812f49a2553-metallb-excludel2\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.442846 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.443285 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-metrics-certs\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.445686 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6259267-1d63-453e-aefd-5eb03b54f532-metrics-certs\") pod \"controller-7bb4cc7c98-vd9n2\" (UID: \"c6259267-1d63-453e-aefd-5eb03b54f532\") " pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.457828 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6259267-1d63-453e-aefd-5eb03b54f532-cert\") pod \"controller-7bb4cc7c98-vd9n2\" (UID: \"c6259267-1d63-453e-aefd-5eb03b54f532\") " pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.458371 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w79ks\" (UniqueName: \"kubernetes.io/projected/881b13b7-beaa-4807-a523-329fd35bb96d-kube-api-access-w79ks\") pod \"auto-csr-approver-29565572-h8tvp\" (UID: \"881b13b7-beaa-4807-a523-329fd35bb96d\") " pod="openshift-infra/auto-csr-approver-29565572-h8tvp" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.459098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cxwv5\" (UniqueName: \"kubernetes.io/projected/c6259267-1d63-453e-aefd-5eb03b54f532-kube-api-access-cxwv5\") pod \"controller-7bb4cc7c98-vd9n2\" (UID: \"c6259267-1d63-453e-aefd-5eb03b54f532\") " pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.459732 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqtqn\" (UniqueName: \"kubernetes.io/projected/8d2a4955-e0a0-42e0-86f5-4812f49a2553-kube-api-access-tqtqn\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.469871 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-z4zw5" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.517525 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565572-h8tvp" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.570194 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.793714 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-vd9n2"] Mar 19 15:32:00 crc kubenswrapper[4771]: W0319 15:32:00.803069 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6259267_1d63_453e_aefd_5eb03b54f532.slice/crio-fe73873cffc9186e806cfb3179900abab474f1749a3628f470413dfcf96ab195 WatchSource:0}: Error finding container fe73873cffc9186e806cfb3179900abab474f1749a3628f470413dfcf96ab195: Status 404 returned error can't find the container with id fe73873cffc9186e806cfb3179900abab474f1749a3628f470413dfcf96ab195 Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.846696 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d13f776e-2828-4557-9abf-1d55eab1cf73-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-9mlqc\" (UID: \"d13f776e-2828-4557-9abf-1d55eab1cf73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.851787 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-vd9n2" event={"ID":"c6259267-1d63-453e-aefd-5eb03b54f532","Type":"ContainerStarted","Data":"fe73873cffc9186e806cfb3179900abab474f1749a3628f470413dfcf96ab195"} Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.852034 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d13f776e-2828-4557-9abf-1d55eab1cf73-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-9mlqc\" (UID: \"d13f776e-2828-4557-9abf-1d55eab1cf73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.852999 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerStarted","Data":"ad2c1cf29fd2c38a7328ca5188d6374b23d5e179513bcd726df4bfcc2bbe3459"} Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.919231 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565572-h8tvp"] Mar 19 15:32:00 crc kubenswrapper[4771]: W0319 15:32:00.922681 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod881b13b7_beaa_4807_a523_329fd35bb96d.slice/crio-85f0e1e6ae3a66c42ffc9c99613191f26005f53feaec5ed206c57e2d7d0c949b WatchSource:0}: Error finding container 85f0e1e6ae3a66c42ffc9c99613191f26005f53feaec5ed206c57e2d7d0c949b: Status 404 returned error can't find the container with id 85f0e1e6ae3a66c42ffc9c99613191f26005f53feaec5ed206c57e2d7d0c949b Mar 19 15:32:00 crc kubenswrapper[4771]: I0319 15:32:00.948400 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-memberlist\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:00 crc kubenswrapper[4771]: E0319 15:32:00.948544 4771 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 15:32:00 crc kubenswrapper[4771]: E0319 15:32:00.948589 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-memberlist podName:8d2a4955-e0a0-42e0-86f5-4812f49a2553 nodeName:}" failed. No retries permitted until 2026-03-19 15:32:01.948574816 +0000 UTC m=+981.177196018 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-memberlist") pod "speaker-vltvb" (UID: "8d2a4955-e0a0-42e0-86f5-4812f49a2553") : secret "metallb-memberlist" not found Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.054005 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.315839 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc"] Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.862210 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" event={"ID":"d13f776e-2828-4557-9abf-1d55eab1cf73","Type":"ContainerStarted","Data":"fc9062c68d2bfddb4d13f8a04bd1b908dc60ba0f665770cd11aad4732332027d"} Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.865911 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-vd9n2" event={"ID":"c6259267-1d63-453e-aefd-5eb03b54f532","Type":"ContainerStarted","Data":"8b380f571aa95212f0a943279f8595e3e8368aff1a3ae020b61ca828e0627443"} Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.865972 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-vd9n2" event={"ID":"c6259267-1d63-453e-aefd-5eb03b54f532","Type":"ContainerStarted","Data":"d0d7f0a4065d2d6b87b54c60354c4cbb3eadd6bc2b9db0c0d88e8e662fbeb1f5"} Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.866221 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-vd9n2" Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.867202 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565572-h8tvp" 
event={"ID":"881b13b7-beaa-4807-a523-329fd35bb96d","Type":"ContainerStarted","Data":"85f0e1e6ae3a66c42ffc9c99613191f26005f53feaec5ed206c57e2d7d0c949b"} Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.867334 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rjgb8" podUID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerName="registry-server" containerID="cri-o://8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a" gracePeriod=2 Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.891190 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-vd9n2" podStartSLOduration=1.891170373 podStartE2EDuration="1.891170373s" podCreationTimestamp="2026-03-19 15:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:32:01.885511596 +0000 UTC m=+981.114132818" watchObservedRunningTime="2026-03-19 15:32:01.891170373 +0000 UTC m=+981.119791585" Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.974186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-memberlist\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:01 crc kubenswrapper[4771]: I0319 15:32:01.981699 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d2a4955-e0a0-42e0-86f5-4812f49a2553-memberlist\") pod \"speaker-vltvb\" (UID: \"8d2a4955-e0a0-42e0-86f5-4812f49a2553\") " pod="metallb-system/speaker-vltvb" Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.061499 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vltvb" Mar 19 15:32:02 crc kubenswrapper[4771]: W0319 15:32:02.080237 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d2a4955_e0a0_42e0_86f5_4812f49a2553.slice/crio-5f97d36b3a929f81d9c7661c32aa8225954700839e30f1c4529791e0557788cc WatchSource:0}: Error finding container 5f97d36b3a929f81d9c7661c32aa8225954700839e30f1c4529791e0557788cc: Status 404 returned error can't find the container with id 5f97d36b3a929f81d9c7661c32aa8225954700839e30f1c4529791e0557788cc Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.503959 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjgb8" Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.582894 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-catalog-content\") pod \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.582970 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c5jb\" (UniqueName: \"kubernetes.io/projected/d339b8d5-d6f0-46f3-8505-191ce6754fa0-kube-api-access-9c5jb\") pod \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.583044 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-utilities\") pod \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\" (UID: \"d339b8d5-d6f0-46f3-8505-191ce6754fa0\") " Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.584473 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-utilities" (OuterVolumeSpecName: "utilities") pod "d339b8d5-d6f0-46f3-8505-191ce6754fa0" (UID: "d339b8d5-d6f0-46f3-8505-191ce6754fa0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.590396 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d339b8d5-d6f0-46f3-8505-191ce6754fa0-kube-api-access-9c5jb" (OuterVolumeSpecName: "kube-api-access-9c5jb") pod "d339b8d5-d6f0-46f3-8505-191ce6754fa0" (UID: "d339b8d5-d6f0-46f3-8505-191ce6754fa0"). InnerVolumeSpecName "kube-api-access-9c5jb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.684416 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c5jb\" (UniqueName: \"kubernetes.io/projected/d339b8d5-d6f0-46f3-8505-191ce6754fa0-kube-api-access-9c5jb\") on node \"crc\" DevicePath \"\""
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.684762 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.702291 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d339b8d5-d6f0-46f3-8505-191ce6754fa0" (UID: "d339b8d5-d6f0-46f3-8505-191ce6754fa0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.785608 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d339b8d5-d6f0-46f3-8505-191ce6754fa0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.875319 4771 generic.go:334] "Generic (PLEG): container finished" podID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerID="8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a" exitCode=0
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.875385 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjgb8" event={"ID":"d339b8d5-d6f0-46f3-8505-191ce6754fa0","Type":"ContainerDied","Data":"8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a"}
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.875404 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjgb8"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.875419 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjgb8" event={"ID":"d339b8d5-d6f0-46f3-8505-191ce6754fa0","Type":"ContainerDied","Data":"2d8035110d5e5fa3b475002bcb165085b40fc0a457b8980e3e01578b95c873f6"}
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.875434 4771 scope.go:117] "RemoveContainer" containerID="8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.876741 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565572-h8tvp" event={"ID":"881b13b7-beaa-4807-a523-329fd35bb96d","Type":"ContainerStarted","Data":"56d67677ac0ab0f0d1788bbe3b70630faeb581ca3304718c54706eeed92d801d"}
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.885024 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vltvb" event={"ID":"8d2a4955-e0a0-42e0-86f5-4812f49a2553","Type":"ContainerStarted","Data":"4ce36f8b24930b392a658c70f2346f4ca427de5768835ed352015da736e12648"}
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.885089 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vltvb" event={"ID":"8d2a4955-e0a0-42e0-86f5-4812f49a2553","Type":"ContainerStarted","Data":"b60e674da580495664725cec13d06b937a6bc9c822a756693cdb73f393f8bec3"}
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.885103 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vltvb" event={"ID":"8d2a4955-e0a0-42e0-86f5-4812f49a2553","Type":"ContainerStarted","Data":"5f97d36b3a929f81d9c7661c32aa8225954700839e30f1c4529791e0557788cc"}
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.885377 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vltvb"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.895962 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565572-h8tvp" podStartSLOduration=1.265276668 podStartE2EDuration="2.895944936s" podCreationTimestamp="2026-03-19 15:32:00 +0000 UTC" firstStartedPulling="2026-03-19 15:32:00.925635494 +0000 UTC m=+980.154256696" lastFinishedPulling="2026-03-19 15:32:02.556303762 +0000 UTC m=+981.784924964" observedRunningTime="2026-03-19 15:32:02.893568279 +0000 UTC m=+982.122189501" watchObservedRunningTime="2026-03-19 15:32:02.895944936 +0000 UTC m=+982.124566138"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.897587 4771 scope.go:117] "RemoveContainer" containerID="f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.912699 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vltvb" podStartSLOduration=2.9126836689999998 podStartE2EDuration="2.912683669s" podCreationTimestamp="2026-03-19 15:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:32:02.91228834 +0000 UTC m=+982.140909562" watchObservedRunningTime="2026-03-19 15:32:02.912683669 +0000 UTC m=+982.141304871"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.929348 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjgb8"]
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.934311 4771 scope.go:117] "RemoveContainer" containerID="250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.951349 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjgb8"]
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.955800 4771 scope.go:117] "RemoveContainer" containerID="8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a"
Mar 19 15:32:02 crc kubenswrapper[4771]: E0319 15:32:02.956398 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a\": container with ID starting with 8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a not found: ID does not exist" containerID="8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.956464 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a"} err="failed to get container status \"8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a\": rpc error: code = NotFound desc = could not find container \"8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a\": container with ID starting with 8fb0fa4d9d811d8e97a9f041abc5d5d370aace3e0ce94a3bb02f112f85a02d7a not found: ID does not exist"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.956497 4771 scope.go:117] "RemoveContainer" containerID="f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87"
Mar 19 15:32:02 crc kubenswrapper[4771]: E0319 15:32:02.956838 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87\": container with ID starting with f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87 not found: ID does not exist" containerID="f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.956876 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87"} err="failed to get container status \"f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87\": rpc error: code = NotFound desc = could not find container \"f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87\": container with ID starting with f7140faf79782235f5b52b8117b82c9d75a532761fa713bbfd00ddb7c7a2df87 not found: ID does not exist"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.956900 4771 scope.go:117] "RemoveContainer" containerID="250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48"
Mar 19 15:32:02 crc kubenswrapper[4771]: E0319 15:32:02.957380 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48\": container with ID starting with 250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48 not found: ID does not exist" containerID="250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48"
Mar 19 15:32:02 crc kubenswrapper[4771]: I0319 15:32:02.957423 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48"} err="failed to get container status \"250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48\": rpc error: code = NotFound desc = could not find container \"250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48\": container with ID starting with 250993272cf1f5aae9524b435e7878e3895805b1e2c40b7b7f7cef0bdb293f48 not found: ID does not exist"
Mar 19 15:32:03 crc kubenswrapper[4771]: I0319 15:32:03.516899 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" path="/var/lib/kubelet/pods/d339b8d5-d6f0-46f3-8505-191ce6754fa0/volumes"
Mar 19 15:32:03 crc kubenswrapper[4771]: I0319 15:32:03.903921 4771 generic.go:334] "Generic (PLEG): container finished" podID="881b13b7-beaa-4807-a523-329fd35bb96d" containerID="56d67677ac0ab0f0d1788bbe3b70630faeb581ca3304718c54706eeed92d801d" exitCode=0
Mar 19 15:32:03 crc kubenswrapper[4771]: I0319 15:32:03.904637 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565572-h8tvp" event={"ID":"881b13b7-beaa-4807-a523-329fd35bb96d","Type":"ContainerDied","Data":"56d67677ac0ab0f0d1788bbe3b70630faeb581ca3304718c54706eeed92d801d"}
Mar 19 15:32:05 crc kubenswrapper[4771]: I0319 15:32:05.173521 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565572-h8tvp"
Mar 19 15:32:05 crc kubenswrapper[4771]: I0319 15:32:05.227042 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w79ks\" (UniqueName: \"kubernetes.io/projected/881b13b7-beaa-4807-a523-329fd35bb96d-kube-api-access-w79ks\") pod \"881b13b7-beaa-4807-a523-329fd35bb96d\" (UID: \"881b13b7-beaa-4807-a523-329fd35bb96d\") "
Mar 19 15:32:05 crc kubenswrapper[4771]: I0319 15:32:05.233339 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881b13b7-beaa-4807-a523-329fd35bb96d-kube-api-access-w79ks" (OuterVolumeSpecName: "kube-api-access-w79ks") pod "881b13b7-beaa-4807-a523-329fd35bb96d" (UID: "881b13b7-beaa-4807-a523-329fd35bb96d"). InnerVolumeSpecName "kube-api-access-w79ks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:32:05 crc kubenswrapper[4771]: I0319 15:32:05.328709 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w79ks\" (UniqueName: \"kubernetes.io/projected/881b13b7-beaa-4807-a523-329fd35bb96d-kube-api-access-w79ks\") on node \"crc\" DevicePath \"\""
Mar 19 15:32:05 crc kubenswrapper[4771]: I0319 15:32:05.930293 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565572-h8tvp" event={"ID":"881b13b7-beaa-4807-a523-329fd35bb96d","Type":"ContainerDied","Data":"85f0e1e6ae3a66c42ffc9c99613191f26005f53feaec5ed206c57e2d7d0c949b"}
Mar 19 15:32:05 crc kubenswrapper[4771]: I0319 15:32:05.930370 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f0e1e6ae3a66c42ffc9c99613191f26005f53feaec5ed206c57e2d7d0c949b"
Mar 19 15:32:05 crc kubenswrapper[4771]: I0319 15:32:05.930332 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565572-h8tvp"
Mar 19 15:32:05 crc kubenswrapper[4771]: I0319 15:32:05.956815 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565566-cpkp9"]
Mar 19 15:32:05 crc kubenswrapper[4771]: I0319 15:32:05.960808 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565566-cpkp9"]
Mar 19 15:32:07 crc kubenswrapper[4771]: I0319 15:32:07.521852 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ce2534-ab98-47b1-8311-eb4f4d13e0dc" path="/var/lib/kubelet/pods/51ce2534-ab98-47b1-8311-eb4f4d13e0dc/volumes"
Mar 19 15:32:08 crc kubenswrapper[4771]: I0319 15:32:08.953746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" event={"ID":"d13f776e-2828-4557-9abf-1d55eab1cf73","Type":"ContainerStarted","Data":"8a7255ce4f83364a309440c56620cc531bfcb6a9c258d874f2c56e156d673ca5"}
Mar 19 15:32:08 crc kubenswrapper[4771]: I0319 15:32:08.954090 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc"
Mar 19 15:32:08 crc kubenswrapper[4771]: I0319 15:32:08.955620 4771 generic.go:334] "Generic (PLEG): container finished" podID="775dcd57-e7c4-416d-a758-a5bb4ccc74ab" containerID="ab583611ee0487ba24887f44dbded7c8181cea455289c7cb623d501bab007978" exitCode=0
Mar 19 15:32:08 crc kubenswrapper[4771]: I0319 15:32:08.955683 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerDied","Data":"ab583611ee0487ba24887f44dbded7c8181cea455289c7cb623d501bab007978"}
Mar 19 15:32:08 crc kubenswrapper[4771]: I0319 15:32:08.971930 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc" podStartSLOduration=2.145490825 podStartE2EDuration="8.971907797s" podCreationTimestamp="2026-03-19 15:32:00 +0000 UTC" firstStartedPulling="2026-03-19 15:32:01.323210863 +0000 UTC m=+980.551832065" lastFinishedPulling="2026-03-19 15:32:08.149627795 +0000 UTC m=+987.378249037" observedRunningTime="2026-03-19 15:32:08.969196831 +0000 UTC m=+988.197818033" watchObservedRunningTime="2026-03-19 15:32:08.971907797 +0000 UTC m=+988.200528999"
Mar 19 15:32:09 crc kubenswrapper[4771]: I0319 15:32:09.967325 4771 generic.go:334] "Generic (PLEG): container finished" podID="775dcd57-e7c4-416d-a758-a5bb4ccc74ab" containerID="16754b459ac2fd97ed16a7337897d96486656a1de49a7e9825519dffce4e066e" exitCode=0
Mar 19 15:32:09 crc kubenswrapper[4771]: I0319 15:32:09.967420 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerDied","Data":"16754b459ac2fd97ed16a7337897d96486656a1de49a7e9825519dffce4e066e"}
Mar 19 15:32:10 crc kubenswrapper[4771]: I0319 15:32:10.574567 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-vd9n2"
Mar 19 15:32:10 crc kubenswrapper[4771]: I0319 15:32:10.980066 4771 generic.go:334] "Generic (PLEG): container finished" podID="775dcd57-e7c4-416d-a758-a5bb4ccc74ab" containerID="e01ee012ff08a90090044b4392daa002c608387a0da83c05dd4b5230e57b94fe" exitCode=0
Mar 19 15:32:10 crc kubenswrapper[4771]: I0319 15:32:10.980173 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerDied","Data":"e01ee012ff08a90090044b4392daa002c608387a0da83c05dd4b5230e57b94fe"}
Mar 19 15:32:12 crc kubenswrapper[4771]: I0319 15:32:12.056554 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerStarted","Data":"9e8f7b4fc6a03657c07aac397acaa575c09c3acd880c1f8cdb22cf4015168225"}
Mar 19 15:32:12 crc kubenswrapper[4771]: I0319 15:32:12.057634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerStarted","Data":"36a173c3aabe2f829934be4d0dde5d5e8e9fb8823c75b7284222fce3a655c427"}
Mar 19 15:32:12 crc kubenswrapper[4771]: I0319 15:32:12.057727 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerStarted","Data":"226ddf76147bb6308db67fe9a3656b383dc2470acfa49fe7c8f76ea93e50b6a4"}
Mar 19 15:32:12 crc kubenswrapper[4771]: I0319 15:32:12.057802 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerStarted","Data":"cfe50068f395626b65ec3ecdf10c2cde0ba32dd40c968b9b5238d3af1dcc37b4"}
Mar 19 15:32:12 crc kubenswrapper[4771]: I0319 15:32:12.057880 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerStarted","Data":"1da7ffa9e197f966870d3bc6288f32b34485561cb736d06df20ef73fd01e5256"}
Mar 19 15:32:12 crc kubenswrapper[4771]: I0319 15:32:12.070057 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vltvb"
Mar 19 15:32:13 crc kubenswrapper[4771]: I0319 15:32:13.073146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-z4zw5" event={"ID":"775dcd57-e7c4-416d-a758-a5bb4ccc74ab","Type":"ContainerStarted","Data":"04c15d3e54f1694e1bdf54d966dada03a9cbfcf704daf8ef9c0c4c653fb94a9b"}
Mar 19 15:32:13 crc kubenswrapper[4771]: I0319 15:32:13.074251 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-z4zw5"
Mar 19 15:32:13 crc kubenswrapper[4771]: I0319 15:32:13.112754 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-z4zw5" podStartSLOduration=5.740173703 podStartE2EDuration="13.1127268s" podCreationTimestamp="2026-03-19 15:32:00 +0000 UTC" firstStartedPulling="2026-03-19 15:32:00.73727963 +0000 UTC m=+979.965900832" lastFinishedPulling="2026-03-19 15:32:08.109832697 +0000 UTC m=+987.338453929" observedRunningTime="2026-03-19 15:32:13.104447851 +0000 UTC m=+992.333069093" watchObservedRunningTime="2026-03-19 15:32:13.1127268 +0000 UTC m=+992.341348032"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.055501 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nzdrj"]
Mar 19 15:32:15 crc kubenswrapper[4771]: E0319 15:32:15.056231 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerName="extract-utilities"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.056256 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerName="extract-utilities"
Mar 19 15:32:15 crc kubenswrapper[4771]: E0319 15:32:15.056279 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881b13b7-beaa-4807-a523-329fd35bb96d" containerName="oc"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.056304 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="881b13b7-beaa-4807-a523-329fd35bb96d" containerName="oc"
Mar 19 15:32:15 crc kubenswrapper[4771]: E0319 15:32:15.056321 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerName="extract-content"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.056333 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerName="extract-content"
Mar 19 15:32:15 crc kubenswrapper[4771]: E0319 15:32:15.056362 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerName="registry-server"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.056372 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerName="registry-server"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.056557 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d339b8d5-d6f0-46f3-8505-191ce6754fa0" containerName="registry-server"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.056575 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="881b13b7-beaa-4807-a523-329fd35bb96d" containerName="oc"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.057186 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nzdrj"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.060021 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.060378 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-pz9kw"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.060499 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlq9w\" (UniqueName: \"kubernetes.io/projected/50a95e4d-3f8a-4e3e-982b-d05768d1ad14-kube-api-access-zlq9w\") pod \"openstack-operator-index-nzdrj\" (UID: \"50a95e4d-3f8a-4e3e-982b-d05768d1ad14\") " pod="openstack-operators/openstack-operator-index-nzdrj"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.060582 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.073162 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nzdrj"]
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.161393 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlq9w\" (UniqueName: \"kubernetes.io/projected/50a95e4d-3f8a-4e3e-982b-d05768d1ad14-kube-api-access-zlq9w\") pod \"openstack-operator-index-nzdrj\" (UID: \"50a95e4d-3f8a-4e3e-982b-d05768d1ad14\") " pod="openstack-operators/openstack-operator-index-nzdrj"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.191194 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlq9w\" (UniqueName: \"kubernetes.io/projected/50a95e4d-3f8a-4e3e-982b-d05768d1ad14-kube-api-access-zlq9w\") pod \"openstack-operator-index-nzdrj\" (UID: \"50a95e4d-3f8a-4e3e-982b-d05768d1ad14\") " pod="openstack-operators/openstack-operator-index-nzdrj"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.384654 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nzdrj"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.479066 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-z4zw5"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.546580 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-z4zw5"
Mar 19 15:32:15 crc kubenswrapper[4771]: I0319 15:32:15.917536 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nzdrj"]
Mar 19 15:32:16 crc kubenswrapper[4771]: I0319 15:32:16.110002 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nzdrj" event={"ID":"50a95e4d-3f8a-4e3e-982b-d05768d1ad14","Type":"ContainerStarted","Data":"20b75383f499c82613adca74d01e0ee40469606dcce25b3d3b7743b2a94b9692"}
Mar 19 15:32:17 crc kubenswrapper[4771]: I0319 15:32:17.428426 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nzdrj"]
Mar 19 15:32:17 crc kubenswrapper[4771]: I0319 15:32:17.836320 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xwfgg"]
Mar 19 15:32:17 crc kubenswrapper[4771]: I0319 15:32:17.837356 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xwfgg"
Mar 19 15:32:17 crc kubenswrapper[4771]: I0319 15:32:17.852328 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xwfgg"]
Mar 19 15:32:17 crc kubenswrapper[4771]: I0319 15:32:17.921884 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9pd5\" (UniqueName: \"kubernetes.io/projected/15fecac0-77e3-4e3f-85db-33b3a4fa0232-kube-api-access-z9pd5\") pod \"openstack-operator-index-xwfgg\" (UID: \"15fecac0-77e3-4e3f-85db-33b3a4fa0232\") " pod="openstack-operators/openstack-operator-index-xwfgg"
Mar 19 15:32:18 crc kubenswrapper[4771]: I0319 15:32:18.023356 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9pd5\" (UniqueName: \"kubernetes.io/projected/15fecac0-77e3-4e3f-85db-33b3a4fa0232-kube-api-access-z9pd5\") pod \"openstack-operator-index-xwfgg\" (UID: \"15fecac0-77e3-4e3f-85db-33b3a4fa0232\") " pod="openstack-operators/openstack-operator-index-xwfgg"
Mar 19 15:32:18 crc kubenswrapper[4771]: I0319 15:32:18.045352 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9pd5\" (UniqueName: \"kubernetes.io/projected/15fecac0-77e3-4e3f-85db-33b3a4fa0232-kube-api-access-z9pd5\") pod \"openstack-operator-index-xwfgg\" (UID: \"15fecac0-77e3-4e3f-85db-33b3a4fa0232\") " pod="openstack-operators/openstack-operator-index-xwfgg"
Mar 19 15:32:18 crc kubenswrapper[4771]: I0319 15:32:18.156830 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xwfgg"
Mar 19 15:32:18 crc kubenswrapper[4771]: I0319 15:32:18.527808 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xwfgg"]
Mar 19 15:32:18 crc kubenswrapper[4771]: W0319 15:32:18.534552 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15fecac0_77e3_4e3f_85db_33b3a4fa0232.slice/crio-41811fa3fc6d6cf8723dfcbc70baaa36f65718cc4bb1b9820cd3b480b6c1494f WatchSource:0}: Error finding container 41811fa3fc6d6cf8723dfcbc70baaa36f65718cc4bb1b9820cd3b480b6c1494f: Status 404 returned error can't find the container with id 41811fa3fc6d6cf8723dfcbc70baaa36f65718cc4bb1b9820cd3b480b6c1494f
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.138054 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xwfgg" event={"ID":"15fecac0-77e3-4e3f-85db-33b3a4fa0232","Type":"ContainerStarted","Data":"803323788ef3ce03cf5eda6c687fef64c2320c9f8d8c3dc3db2cc69db7fed1d5"}
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.138166 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xwfgg" event={"ID":"15fecac0-77e3-4e3f-85db-33b3a4fa0232","Type":"ContainerStarted","Data":"41811fa3fc6d6cf8723dfcbc70baaa36f65718cc4bb1b9820cd3b480b6c1494f"}
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.140674 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nzdrj" event={"ID":"50a95e4d-3f8a-4e3e-982b-d05768d1ad14","Type":"ContainerStarted","Data":"bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8"}
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.141051 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nzdrj" podUID="50a95e4d-3f8a-4e3e-982b-d05768d1ad14" containerName="registry-server" containerID="cri-o://bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8" gracePeriod=2
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.172102 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xwfgg" podStartSLOduration=2.119933855 podStartE2EDuration="2.17207581s" podCreationTimestamp="2026-03-19 15:32:17 +0000 UTC" firstStartedPulling="2026-03-19 15:32:18.538799858 +0000 UTC m=+997.767421060" lastFinishedPulling="2026-03-19 15:32:18.590941813 +0000 UTC m=+997.819563015" observedRunningTime="2026-03-19 15:32:19.167693995 +0000 UTC m=+998.396315207" watchObservedRunningTime="2026-03-19 15:32:19.17207581 +0000 UTC m=+998.400697022"
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.193757 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nzdrj" podStartSLOduration=1.81507234 podStartE2EDuration="4.193732941s" podCreationTimestamp="2026-03-19 15:32:15 +0000 UTC" firstStartedPulling="2026-03-19 15:32:15.928122163 +0000 UTC m=+995.156743385" lastFinishedPulling="2026-03-19 15:32:18.306782774 +0000 UTC m=+997.535403986" observedRunningTime="2026-03-19 15:32:19.193354092 +0000 UTC m=+998.421975314" watchObservedRunningTime="2026-03-19 15:32:19.193732941 +0000 UTC m=+998.422354183"
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.606938 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nzdrj"
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.644899 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlq9w\" (UniqueName: \"kubernetes.io/projected/50a95e4d-3f8a-4e3e-982b-d05768d1ad14-kube-api-access-zlq9w\") pod \"50a95e4d-3f8a-4e3e-982b-d05768d1ad14\" (UID: \"50a95e4d-3f8a-4e3e-982b-d05768d1ad14\") "
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.652970 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a95e4d-3f8a-4e3e-982b-d05768d1ad14-kube-api-access-zlq9w" (OuterVolumeSpecName: "kube-api-access-zlq9w") pod "50a95e4d-3f8a-4e3e-982b-d05768d1ad14" (UID: "50a95e4d-3f8a-4e3e-982b-d05768d1ad14"). InnerVolumeSpecName "kube-api-access-zlq9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:32:19 crc kubenswrapper[4771]: I0319 15:32:19.747831 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlq9w\" (UniqueName: \"kubernetes.io/projected/50a95e4d-3f8a-4e3e-982b-d05768d1ad14-kube-api-access-zlq9w\") on node \"crc\" DevicePath \"\""
Mar 19 15:32:20 crc kubenswrapper[4771]: I0319 15:32:20.152939 4771 generic.go:334] "Generic (PLEG): container finished" podID="50a95e4d-3f8a-4e3e-982b-d05768d1ad14" containerID="bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8" exitCode=0
Mar 19 15:32:20 crc kubenswrapper[4771]: I0319 15:32:20.153072 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nzdrj" event={"ID":"50a95e4d-3f8a-4e3e-982b-d05768d1ad14","Type":"ContainerDied","Data":"bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8"}
Mar 19 15:32:20 crc kubenswrapper[4771]: I0319 15:32:20.153177 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nzdrj" event={"ID":"50a95e4d-3f8a-4e3e-982b-d05768d1ad14","Type":"ContainerDied","Data":"20b75383f499c82613adca74d01e0ee40469606dcce25b3d3b7743b2a94b9692"}
Mar 19 15:32:20 crc kubenswrapper[4771]: I0319 15:32:20.153216 4771 scope.go:117] "RemoveContainer" containerID="bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8"
Mar 19 15:32:20 crc kubenswrapper[4771]: I0319 15:32:20.154173 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nzdrj"
Mar 19 15:32:20 crc kubenswrapper[4771]: I0319 15:32:20.185413 4771 scope.go:117] "RemoveContainer" containerID="bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8"
Mar 19 15:32:20 crc kubenswrapper[4771]: E0319 15:32:20.185964 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8\": container with ID starting with bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8 not found: ID does not exist" containerID="bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8"
Mar 19 15:32:20 crc kubenswrapper[4771]: I0319 15:32:20.186091 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8"} err="failed to get container status \"bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8\": rpc error: code = NotFound desc = could not find container \"bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8\": container with ID starting with bf8695571eb6b567f79ee87f173235758474dcc8e33d6a5db0247edb92d0b5d8 not found: ID does not exist"
Mar 19 15:32:20 crc kubenswrapper[4771]: I0319 15:32:20.209785 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nzdrj"]
Mar 19 15:32:20 crc kubenswrapper[4771]: I0319 15:32:20.219502 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nzdrj"]
Mar 19 15:32:21 crc kubenswrapper[4771]: I0319 15:32:21.071790 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9mlqc"
Mar 19 15:32:21 crc kubenswrapper[4771]: I0319 15:32:21.521769 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a95e4d-3f8a-4e3e-982b-d05768d1ad14" path="/var/lib/kubelet/pods/50a95e4d-3f8a-4e3e-982b-d05768d1ad14/volumes"
Mar 19 15:32:23 crc kubenswrapper[4771]: I0319 15:32:23.028042 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:32:23 crc kubenswrapper[4771]: I0319 15:32:23.028486 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:32:28 crc kubenswrapper[4771]: I0319 15:32:28.157644 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xwfgg"
Mar 19 15:32:28 crc kubenswrapper[4771]: I0319 15:32:28.157739 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xwfgg"
Mar 19 15:32:28 crc kubenswrapper[4771]: I0319 15:32:28.199461 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xwfgg"
Mar 19 15:32:28 crc kubenswrapper[4771]: I0319 15:32:28.259960 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xwfgg"
Mar 19 15:32:30 crc kubenswrapper[4771]: I0319 15:32:30.474210 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-z4zw5"
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.482318 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4"]
Mar 19 15:32:41 crc kubenswrapper[4771]: E0319 15:32:41.483284 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a95e4d-3f8a-4e3e-982b-d05768d1ad14" containerName="registry-server"
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.483306 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a95e4d-3f8a-4e3e-982b-d05768d1ad14" containerName="registry-server"
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.483514 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a95e4d-3f8a-4e3e-982b-d05768d1ad14" containerName="registry-server"
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.484769 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4"
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.487334 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mvhhz"
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.503834 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4"]
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.597948 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-bundle\") pod \"db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4"
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.598405 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnz5f\" (UniqueName: \"kubernetes.io/projected/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-kube-api-access-rnz5f\") pod \"db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4"
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.598489 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-util\") pod \"db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4"
Mar 19 15:32:41 crc kubenswrapper[4771]: I0319
15:32:41.699132 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnz5f\" (UniqueName: \"kubernetes.io/projected/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-kube-api-access-rnz5f\") pod \"db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.699450 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-util\") pod \"db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.699668 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-bundle\") pod \"db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.700330 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-util\") pod \"db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.701463 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-bundle\") pod \"db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.719206 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnz5f\" (UniqueName: \"kubernetes.io/projected/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-kube-api-access-rnz5f\") pod \"db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" Mar 19 15:32:41 crc kubenswrapper[4771]: I0319 15:32:41.809771 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" Mar 19 15:32:42 crc kubenswrapper[4771]: I0319 15:32:42.299042 4771 scope.go:117] "RemoveContainer" containerID="ee2c415f18a1be2bc58bdb26c16bff0696ade8ab40ab2f887a13888f13682768" Mar 19 15:32:42 crc kubenswrapper[4771]: I0319 15:32:42.309268 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4"] Mar 19 15:32:42 crc kubenswrapper[4771]: W0319 15:32:42.320238 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59ea6c13_a174_4ed2_bd52_8f5af5c11cfb.slice/crio-040c8702d01e9dfb92ac2af45449c7a32a96f15290788b6055fa829394c30c62 WatchSource:0}: Error finding container 040c8702d01e9dfb92ac2af45449c7a32a96f15290788b6055fa829394c30c62: Status 404 returned error can't find the container with id 040c8702d01e9dfb92ac2af45449c7a32a96f15290788b6055fa829394c30c62 Mar 19 15:32:43 crc kubenswrapper[4771]: I0319 15:32:43.330726 4771 generic.go:334] 
"Generic (PLEG): container finished" podID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerID="dc18df841cf92a382f0292be07fce03c79ee49a4998f862b4c99292fcbf40dcd" exitCode=0 Mar 19 15:32:43 crc kubenswrapper[4771]: I0319 15:32:43.330852 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" event={"ID":"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb","Type":"ContainerDied","Data":"dc18df841cf92a382f0292be07fce03c79ee49a4998f862b4c99292fcbf40dcd"} Mar 19 15:32:43 crc kubenswrapper[4771]: I0319 15:32:43.331063 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" event={"ID":"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb","Type":"ContainerStarted","Data":"040c8702d01e9dfb92ac2af45449c7a32a96f15290788b6055fa829394c30c62"} Mar 19 15:32:44 crc kubenswrapper[4771]: I0319 15:32:44.339435 4771 generic.go:334] "Generic (PLEG): container finished" podID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerID="44bf7392e8b81e8de4cbf5ac8df753e1aea127e191fd326c3a4324d4200a557e" exitCode=0 Mar 19 15:32:44 crc kubenswrapper[4771]: I0319 15:32:44.339488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" event={"ID":"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb","Type":"ContainerDied","Data":"44bf7392e8b81e8de4cbf5ac8df753e1aea127e191fd326c3a4324d4200a557e"} Mar 19 15:32:45 crc kubenswrapper[4771]: I0319 15:32:45.361565 4771 generic.go:334] "Generic (PLEG): container finished" podID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerID="1af6400ef86a7f2b7032c6cb0bc755c68bda698cdc26befcb1c765b789c9b5a7" exitCode=0 Mar 19 15:32:45 crc kubenswrapper[4771]: I0319 15:32:45.361654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" 
event={"ID":"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb","Type":"ContainerDied","Data":"1af6400ef86a7f2b7032c6cb0bc755c68bda698cdc26befcb1c765b789c9b5a7"} Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.707433 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.813569 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-bundle\") pod \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.813715 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-util\") pod \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.813759 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnz5f\" (UniqueName: \"kubernetes.io/projected/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-kube-api-access-rnz5f\") pod \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\" (UID: \"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb\") " Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.814701 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-bundle" (OuterVolumeSpecName: "bundle") pod "59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" (UID: "59ea6c13-a174-4ed2-bd52-8f5af5c11cfb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.818857 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-kube-api-access-rnz5f" (OuterVolumeSpecName: "kube-api-access-rnz5f") pod "59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" (UID: "59ea6c13-a174-4ed2-bd52-8f5af5c11cfb"). InnerVolumeSpecName "kube-api-access-rnz5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.831541 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-util" (OuterVolumeSpecName: "util") pod "59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" (UID: "59ea6c13-a174-4ed2-bd52-8f5af5c11cfb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.915320 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnz5f\" (UniqueName: \"kubernetes.io/projected/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-kube-api-access-rnz5f\") on node \"crc\" DevicePath \"\"" Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.915345 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 15:32:46 crc kubenswrapper[4771]: I0319 15:32:46.915354 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59ea6c13-a174-4ed2-bd52-8f5af5c11cfb-util\") on node \"crc\" DevicePath \"\"" Mar 19 15:32:47 crc kubenswrapper[4771]: I0319 15:32:47.382283 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" 
event={"ID":"59ea6c13-a174-4ed2-bd52-8f5af5c11cfb","Type":"ContainerDied","Data":"040c8702d01e9dfb92ac2af45449c7a32a96f15290788b6055fa829394c30c62"} Mar 19 15:32:47 crc kubenswrapper[4771]: I0319 15:32:47.382365 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="040c8702d01e9dfb92ac2af45449c7a32a96f15290788b6055fa829394c30c62" Mar 19 15:32:47 crc kubenswrapper[4771]: I0319 15:32:47.382462 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.027741 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.028310 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.583205 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz"] Mar 19 15:32:53 crc kubenswrapper[4771]: E0319 15:32:53.583484 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerName="pull" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.583504 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerName="pull" Mar 19 15:32:53 crc kubenswrapper[4771]: E0319 15:32:53.583521 4771 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerName="extract" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.583529 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerName="extract" Mar 19 15:32:53 crc kubenswrapper[4771]: E0319 15:32:53.583546 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerName="util" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.583553 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerName="util" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.583693 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ea6c13-a174-4ed2-bd52-8f5af5c11cfb" containerName="extract" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.584185 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.586330 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-2rh9d" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.618918 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz"] Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.718880 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bpnw\" (UniqueName: \"kubernetes.io/projected/ddeeae87-c5db-4209-b4e2-8ad178a811fc-kube-api-access-7bpnw\") pod \"openstack-operator-controller-init-5655c486f8-5kdjz\" (UID: \"ddeeae87-c5db-4209-b4e2-8ad178a811fc\") " pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" Mar 19 15:32:53 crc 
kubenswrapper[4771]: I0319 15:32:53.820665 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bpnw\" (UniqueName: \"kubernetes.io/projected/ddeeae87-c5db-4209-b4e2-8ad178a811fc-kube-api-access-7bpnw\") pod \"openstack-operator-controller-init-5655c486f8-5kdjz\" (UID: \"ddeeae87-c5db-4209-b4e2-8ad178a811fc\") " pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.842635 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bpnw\" (UniqueName: \"kubernetes.io/projected/ddeeae87-c5db-4209-b4e2-8ad178a811fc-kube-api-access-7bpnw\") pod \"openstack-operator-controller-init-5655c486f8-5kdjz\" (UID: \"ddeeae87-c5db-4209-b4e2-8ad178a811fc\") " pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" Mar 19 15:32:53 crc kubenswrapper[4771]: I0319 15:32:53.899787 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" Mar 19 15:32:54 crc kubenswrapper[4771]: I0319 15:32:54.199559 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz"] Mar 19 15:32:54 crc kubenswrapper[4771]: I0319 15:32:54.434977 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" event={"ID":"ddeeae87-c5db-4209-b4e2-8ad178a811fc","Type":"ContainerStarted","Data":"b3558b2b270060552d01f3c2d7402f8536aa8332bb6567cc8445579ad3184151"} Mar 19 15:32:58 crc kubenswrapper[4771]: I0319 15:32:58.463331 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" event={"ID":"ddeeae87-c5db-4209-b4e2-8ad178a811fc","Type":"ContainerStarted","Data":"1d4e367e285ecb86df08381eb09ce72794d2830d3cbbb59e1bd97e3b2f43f77b"} Mar 19 15:32:58 crc kubenswrapper[4771]: I0319 15:32:58.463931 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" Mar 19 15:33:03 crc kubenswrapper[4771]: I0319 15:33:03.903670 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" Mar 19 15:33:03 crc kubenswrapper[4771]: I0319 15:33:03.940734 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5655c486f8-5kdjz" podStartSLOduration=7.471125511 podStartE2EDuration="10.940640048s" podCreationTimestamp="2026-03-19 15:32:53 +0000 UTC" firstStartedPulling="2026-03-19 15:32:54.22228137 +0000 UTC m=+1033.450902572" lastFinishedPulling="2026-03-19 15:32:57.691795897 +0000 UTC m=+1036.920417109" observedRunningTime="2026-03-19 15:32:58.503292399 +0000 UTC m=+1037.731913641" 
watchObservedRunningTime="2026-03-19 15:33:03.940640048 +0000 UTC m=+1043.169261260" Mar 19 15:33:23 crc kubenswrapper[4771]: I0319 15:33:23.027412 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:33:23 crc kubenswrapper[4771]: I0319 15:33:23.027978 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:33:23 crc kubenswrapper[4771]: I0319 15:33:23.028101 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:33:23 crc kubenswrapper[4771]: I0319 15:33:23.029124 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed5a553c2d92c915ce47410116c0fc185162ea3ab77f14a7e2453e14985c8a40"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 15:33:23 crc kubenswrapper[4771]: I0319 15:33:23.029216 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://ed5a553c2d92c915ce47410116c0fc185162ea3ab77f14a7e2453e14985c8a40" gracePeriod=600 Mar 19 15:33:23 crc kubenswrapper[4771]: I0319 15:33:23.641627 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="ed5a553c2d92c915ce47410116c0fc185162ea3ab77f14a7e2453e14985c8a40" exitCode=0 Mar 19 15:33:23 crc kubenswrapper[4771]: I0319 15:33:23.641709 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"ed5a553c2d92c915ce47410116c0fc185162ea3ab77f14a7e2453e14985c8a40"} Mar 19 15:33:23 crc kubenswrapper[4771]: I0319 15:33:23.642543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"953ae2c967341b87ceb085ed4fd2e6023f2aede65dc55a12a3a59811b0300199"} Mar 19 15:33:23 crc kubenswrapper[4771]: I0319 15:33:23.642638 4771 scope.go:117] "RemoveContainer" containerID="b1c74e779cae295d8261f06e3e0b3804798c6e48c30f016b0eaeacf144fab8c8" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.304031 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.306005 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.308475 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vlvvr" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.315604 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.319765 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.320476 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.321842 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5chrb" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.332065 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.344617 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.345558 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.351805 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-pjpwz" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.376132 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.376908 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.380275 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.381274 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rhkl4" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.384737 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.404509 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.419243 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.422705 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-m9pzx" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.430279 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whv6\" (UniqueName: \"kubernetes.io/projected/4717b1db-fd1d-4e9f-b04d-b88488b35369-kube-api-access-8whv6\") pod \"barbican-operator-controller-manager-59bc569d95-x9hpx\" (UID: \"4717b1db-fd1d-4e9f-b04d-b88488b35369\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.454631 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.455867 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.461049 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4xswf" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.474699 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.485332 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.500824 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.501701 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.505541 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-f58m8" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.524142 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.524854 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.525008 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.527160 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-npzxs" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.527454 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.534121 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbsk\" (UniqueName: \"kubernetes.io/projected/297266bf-7ed9-43bf-abfa-d608acf96290-kube-api-access-2fbsk\") pod \"designate-operator-controller-manager-588d4d986b-d4wwp\" (UID: \"297266bf-7ed9-43bf-abfa-d608acf96290\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.534171 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxsd\" (UniqueName: \"kubernetes.io/projected/f10d9b25-6f62-4300-9827-bebe80433dda-kube-api-access-mhxsd\") pod \"heat-operator-controller-manager-67dd5f86f5-7p6l9\" (UID: \"f10d9b25-6f62-4300-9827-bebe80433dda\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.534187 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.534233 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnw6b\" (UniqueName: \"kubernetes.io/projected/682bc21b-ae46-487c-b1f8-a8626914fff4-kube-api-access-pnw6b\") pod \"ironic-operator-controller-manager-6f787dddc9-5xksc\" (UID: \"682bc21b-ae46-487c-b1f8-a8626914fff4\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.534250 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgf7z\" (UniqueName: \"kubernetes.io/projected/bb45e644-93ea-41f8-96b5-bf1765f44488-kube-api-access-kgf7z\") pod \"horizon-operator-controller-manager-8464cc45fb-hgvwc\" (UID: \"bb45e644-93ea-41f8-96b5-bf1765f44488\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.534272 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8whv6\" (UniqueName: \"kubernetes.io/projected/4717b1db-fd1d-4e9f-b04d-b88488b35369-kube-api-access-8whv6\") pod \"barbican-operator-controller-manager-59bc569d95-x9hpx\" (UID: \"4717b1db-fd1d-4e9f-b04d-b88488b35369\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.534292 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kt82\" (UniqueName: \"kubernetes.io/projected/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-kube-api-access-8kt82\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.534310 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hxq\" 
(UniqueName: \"kubernetes.io/projected/3701ec62-21e3-4bb7-8e32-c09fb4c5d619-kube-api-access-r2hxq\") pod \"cinder-operator-controller-manager-8d58dc466-xllmw\" (UID: \"3701ec62-21e3-4bb7-8e32-c09fb4c5d619\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.534330 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd8nd\" (UniqueName: \"kubernetes.io/projected/8d301298-e7fb-4ce7-8369-7d1887b6a913-kube-api-access-nd8nd\") pod \"glance-operator-controller-manager-79df6bcc97-9mp28\" (UID: \"8d301298-e7fb-4ce7-8369-7d1887b6a913\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.552041 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.561474 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.561929 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whv6\" (UniqueName: \"kubernetes.io/projected/4717b1db-fd1d-4e9f-b04d-b88488b35369-kube-api-access-8whv6\") pod \"barbican-operator-controller-manager-59bc569d95-x9hpx\" (UID: \"4717b1db-fd1d-4e9f-b04d-b88488b35369\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.562379 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.564270 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hn9x4" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.575220 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.576480 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.580057 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.592644 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vpkjr" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.594059 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.595187 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.599887 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kr2wk" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.617137 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.631197 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638012 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbsk\" (UniqueName: \"kubernetes.io/projected/297266bf-7ed9-43bf-abfa-d608acf96290-kube-api-access-2fbsk\") pod \"designate-operator-controller-manager-588d4d986b-d4wwp\" (UID: \"297266bf-7ed9-43bf-abfa-d608acf96290\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcrwz\" (UniqueName: \"kubernetes.io/projected/3196e589-b895-4c94-aa2b-1b4a1b0786cf-kube-api-access-rcrwz\") pod \"manila-operator-controller-manager-55f864c847-t7mdc\" (UID: \"3196e589-b895-4c94-aa2b-1b4a1b0786cf\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638084 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rblf\" (UniqueName: \"kubernetes.io/projected/6c6e6d57-bc57-4368-9b9f-ce85dbf99b46-kube-api-access-6rblf\") pod 
\"mariadb-operator-controller-manager-67ccfc9778-w4pll\" (UID: \"6c6e6d57-bc57-4368-9b9f-ce85dbf99b46\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638111 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxsd\" (UniqueName: \"kubernetes.io/projected/f10d9b25-6f62-4300-9827-bebe80433dda-kube-api-access-mhxsd\") pod \"heat-operator-controller-manager-67dd5f86f5-7p6l9\" (UID: \"f10d9b25-6f62-4300-9827-bebe80433dda\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638130 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnw6b\" (UniqueName: \"kubernetes.io/projected/682bc21b-ae46-487c-b1f8-a8626914fff4-kube-api-access-pnw6b\") pod \"ironic-operator-controller-manager-6f787dddc9-5xksc\" (UID: \"682bc21b-ae46-487c-b1f8-a8626914fff4\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638176 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgf7z\" (UniqueName: \"kubernetes.io/projected/bb45e644-93ea-41f8-96b5-bf1765f44488-kube-api-access-kgf7z\") pod \"horizon-operator-controller-manager-8464cc45fb-hgvwc\" (UID: \"bb45e644-93ea-41f8-96b5-bf1765f44488\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" Mar 19 15:33:37 
crc kubenswrapper[4771]: I0319 15:33:37.638195 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lwd4\" (UniqueName: \"kubernetes.io/projected/3fc80c77-8c96-492a-8da1-4e617cfc2878-kube-api-access-4lwd4\") pod \"keystone-operator-controller-manager-768b96df4c-knffk\" (UID: \"3fc80c77-8c96-492a-8da1-4e617cfc2878\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638215 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kt82\" (UniqueName: \"kubernetes.io/projected/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-kube-api-access-8kt82\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638234 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hxq\" (UniqueName: \"kubernetes.io/projected/3701ec62-21e3-4bb7-8e32-c09fb4c5d619-kube-api-access-r2hxq\") pod \"cinder-operator-controller-manager-8d58dc466-xllmw\" (UID: \"3701ec62-21e3-4bb7-8e32-c09fb4c5d619\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638256 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd8nd\" (UniqueName: \"kubernetes.io/projected/8d301298-e7fb-4ce7-8369-7d1887b6a913-kube-api-access-nd8nd\") pod \"glance-operator-controller-manager-79df6bcc97-9mp28\" (UID: \"8d301298-e7fb-4ce7-8369-7d1887b6a913\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.638842 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll"] Mar 19 15:33:37 crc kubenswrapper[4771]: E0319 15:33:37.638843 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:37 crc kubenswrapper[4771]: E0319 15:33:37.638921 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert podName:6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992 nodeName:}" failed. No retries permitted until 2026-03-19 15:33:38.138896019 +0000 UTC m=+1077.367517221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert") pod "infra-operator-controller-manager-7b487c85ff-rdr25" (UID: "6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992") : secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.657079 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-mldfm"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.657864 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.663660 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-kdvcf" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.668705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbsk\" (UniqueName: \"kubernetes.io/projected/297266bf-7ed9-43bf-abfa-d608acf96290-kube-api-access-2fbsk\") pod \"designate-operator-controller-manager-588d4d986b-d4wwp\" (UID: \"297266bf-7ed9-43bf-abfa-d608acf96290\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.668715 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hxq\" (UniqueName: \"kubernetes.io/projected/3701ec62-21e3-4bb7-8e32-c09fb4c5d619-kube-api-access-r2hxq\") pod \"cinder-operator-controller-manager-8d58dc466-xllmw\" (UID: \"3701ec62-21e3-4bb7-8e32-c09fb4c5d619\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.669223 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kt82\" (UniqueName: \"kubernetes.io/projected/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-kube-api-access-8kt82\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.679581 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.685528 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd8nd\" (UniqueName: \"kubernetes.io/projected/8d301298-e7fb-4ce7-8369-7d1887b6a913-kube-api-access-nd8nd\") pod \"glance-operator-controller-manager-79df6bcc97-9mp28\" (UID: \"8d301298-e7fb-4ce7-8369-7d1887b6a913\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.686790 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgf7z\" (UniqueName: \"kubernetes.io/projected/bb45e644-93ea-41f8-96b5-bf1765f44488-kube-api-access-kgf7z\") pod \"horizon-operator-controller-manager-8464cc45fb-hgvwc\" (UID: \"bb45e644-93ea-41f8-96b5-bf1765f44488\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.689727 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnw6b\" (UniqueName: \"kubernetes.io/projected/682bc21b-ae46-487c-b1f8-a8626914fff4-kube-api-access-pnw6b\") pod \"ironic-operator-controller-manager-6f787dddc9-5xksc\" (UID: \"682bc21b-ae46-487c-b1f8-a8626914fff4\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.692587 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxsd\" (UniqueName: \"kubernetes.io/projected/f10d9b25-6f62-4300-9827-bebe80433dda-kube-api-access-mhxsd\") pod \"heat-operator-controller-manager-67dd5f86f5-7p6l9\" (UID: \"f10d9b25-6f62-4300-9827-bebe80433dda\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.693125 4771 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.701510 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.702519 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.714271 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-mldfm"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.716537 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-lrrjh" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.738774 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.739798 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.742749 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5gl7\" (UniqueName: \"kubernetes.io/projected/051674d9-53cb-4cbc-ae54-b6beb16456ee-kube-api-access-s5gl7\") pod \"nova-operator-controller-manager-5d488d59fb-5zj7h\" (UID: \"051674d9-53cb-4cbc-ae54-b6beb16456ee\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.742816 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcrwz\" (UniqueName: \"kubernetes.io/projected/3196e589-b895-4c94-aa2b-1b4a1b0786cf-kube-api-access-rcrwz\") pod \"manila-operator-controller-manager-55f864c847-t7mdc\" (UID: \"3196e589-b895-4c94-aa2b-1b4a1b0786cf\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.742840 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rblf\" (UniqueName: \"kubernetes.io/projected/6c6e6d57-bc57-4368-9b9f-ce85dbf99b46-kube-api-access-6rblf\") pod \"mariadb-operator-controller-manager-67ccfc9778-w4pll\" (UID: \"6c6e6d57-bc57-4368-9b9f-ce85dbf99b46\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.742879 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z9rk\" (UniqueName: \"kubernetes.io/projected/8caf5d5b-cffa-4b03-b9b9-7bd54217fda6-kube-api-access-4z9rk\") pod \"neutron-operator-controller-manager-767865f676-mldfm\" (UID: \"8caf5d5b-cffa-4b03-b9b9-7bd54217fda6\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" Mar 19 15:33:37 crc 
kubenswrapper[4771]: I0319 15:33:37.742926 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdbt4\" (UniqueName: \"kubernetes.io/projected/14fa11c5-1371-4da1-aa9b-8b7b2463600e-kube-api-access-zdbt4\") pod \"octavia-operator-controller-manager-5b9f45d989-xhqbn\" (UID: \"14fa11c5-1371-4da1-aa9b-8b7b2463600e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.742950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lwd4\" (UniqueName: \"kubernetes.io/projected/3fc80c77-8c96-492a-8da1-4e617cfc2878-kube-api-access-4lwd4\") pod \"keystone-operator-controller-manager-768b96df4c-knffk\" (UID: \"3fc80c77-8c96-492a-8da1-4e617cfc2878\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.744036 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jx4qf" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.744815 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.756841 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.768298 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcrwz\" (UniqueName: \"kubernetes.io/projected/3196e589-b895-4c94-aa2b-1b4a1b0786cf-kube-api-access-rcrwz\") pod \"manila-operator-controller-manager-55f864c847-t7mdc\" (UID: \"3196e589-b895-4c94-aa2b-1b4a1b0786cf\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.770499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rblf\" (UniqueName: \"kubernetes.io/projected/6c6e6d57-bc57-4368-9b9f-ce85dbf99b46-kube-api-access-6rblf\") pod \"mariadb-operator-controller-manager-67ccfc9778-w4pll\" (UID: \"6c6e6d57-bc57-4368-9b9f-ce85dbf99b46\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.771370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lwd4\" (UniqueName: \"kubernetes.io/projected/3fc80c77-8c96-492a-8da1-4e617cfc2878-kube-api-access-4lwd4\") pod \"keystone-operator-controller-manager-768b96df4c-knffk\" (UID: \"3fc80c77-8c96-492a-8da1-4e617cfc2878\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.780131 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.789030 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.791111 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.791939 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.795121 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-g9kn4" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.795277 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.824305 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.835664 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.836814 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.839403 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hd47d" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.846038 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5gl7\" (UniqueName: \"kubernetes.io/projected/051674d9-53cb-4cbc-ae54-b6beb16456ee-kube-api-access-s5gl7\") pod \"nova-operator-controller-manager-5d488d59fb-5zj7h\" (UID: \"051674d9-53cb-4cbc-ae54-b6beb16456ee\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.846104 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z9rk\" (UniqueName: \"kubernetes.io/projected/8caf5d5b-cffa-4b03-b9b9-7bd54217fda6-kube-api-access-4z9rk\") pod \"neutron-operator-controller-manager-767865f676-mldfm\" (UID: \"8caf5d5b-cffa-4b03-b9b9-7bd54217fda6\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.846146 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdbt4\" (UniqueName: \"kubernetes.io/projected/14fa11c5-1371-4da1-aa9b-8b7b2463600e-kube-api-access-zdbt4\") pod \"octavia-operator-controller-manager-5b9f45d989-xhqbn\" (UID: \"14fa11c5-1371-4da1-aa9b-8b7b2463600e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.846548 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.876610 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zdbt4\" (UniqueName: \"kubernetes.io/projected/14fa11c5-1371-4da1-aa9b-8b7b2463600e-kube-api-access-zdbt4\") pod \"octavia-operator-controller-manager-5b9f45d989-xhqbn\" (UID: \"14fa11c5-1371-4da1-aa9b-8b7b2463600e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.881100 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.882056 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.882248 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5gl7\" (UniqueName: \"kubernetes.io/projected/051674d9-53cb-4cbc-ae54-b6beb16456ee-kube-api-access-s5gl7\") pod \"nova-operator-controller-manager-5d488d59fb-5zj7h\" (UID: \"051674d9-53cb-4cbc-ae54-b6beb16456ee\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.883369 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z9rk\" (UniqueName: \"kubernetes.io/projected/8caf5d5b-cffa-4b03-b9b9-7bd54217fda6-kube-api-access-4z9rk\") pod \"neutron-operator-controller-manager-767865f676-mldfm\" (UID: \"8caf5d5b-cffa-4b03-b9b9-7bd54217fda6\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.888014 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-z9vjc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.901617 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.907790 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-r479m"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.910939 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.915233 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.920023 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-c2x58" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.927081 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-r479m"] Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.944561 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.944924 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.952470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6mbs\" (UniqueName: \"kubernetes.io/projected/0200604c-cbeb-45ea-9f92-b5f857d05b23-kube-api-access-k6mbs\") pod \"ovn-operator-controller-manager-884679f54-6s7qk\" (UID: \"0200604c-cbeb-45ea-9f92-b5f857d05b23\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.952720 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksx64\" (UniqueName: \"kubernetes.io/projected/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-kube-api-access-ksx64\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.952906 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.960706 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" Mar 19 15:33:37 crc kubenswrapper[4771]: I0319 15:33:37.983330 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.016128 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.017057 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.032198 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-d8g7x" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.037695 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.054345 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-449dp\" (UniqueName: \"kubernetes.io/projected/2a2f4027-c0c8-4032-9e40-ab2ce99c899f-kube-api-access-449dp\") pod \"swift-operator-controller-manager-c674c5965-r479m\" (UID: \"2a2f4027-c0c8-4032-9e40-ab2ce99c899f\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.054409 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksx64\" (UniqueName: \"kubernetes.io/projected/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-kube-api-access-ksx64\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.054467 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.054522 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v824j\" (UniqueName: \"kubernetes.io/projected/ade0d786-88b7-465a-917a-7147ae923a01-kube-api-access-v824j\") pod \"placement-operator-controller-manager-5784578c99-rbjxs\" (UID: \"ade0d786-88b7-465a-917a-7147ae923a01\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.054544 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6mbs\" (UniqueName: \"kubernetes.io/projected/0200604c-cbeb-45ea-9f92-b5f857d05b23-kube-api-access-k6mbs\") pod \"ovn-operator-controller-manager-884679f54-6s7qk\" (UID: \"0200604c-cbeb-45ea-9f92-b5f857d05b23\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.054886 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.054932 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert podName:7ccc7b81-b1de-48e8-aec9-f88f615ebf88 nodeName:}" failed. 
No retries permitted until 2026-03-19 15:33:38.554920092 +0000 UTC m=+1077.783541294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" (UID: "7ccc7b81-b1de-48e8-aec9-f88f615ebf88") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.066444 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.067692 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.078662 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-jhswb" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.079619 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksx64\" (UniqueName: \"kubernetes.io/projected/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-kube-api-access-ksx64\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.086605 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.093688 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.104221 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.124581 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.135304 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.136457 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.137632 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6mbs\" (UniqueName: \"kubernetes.io/projected/0200604c-cbeb-45ea-9f92-b5f857d05b23-kube-api-access-k6mbs\") pod \"ovn-operator-controller-manager-884679f54-6s7qk\" (UID: \"0200604c-cbeb-45ea-9f92-b5f857d05b23\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.138733 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fxck5" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.152074 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.155394 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjq6x\" (UniqueName: \"kubernetes.io/projected/164abed1-8fb0-4276-acc5-08c87a08ba9a-kube-api-access-xjq6x\") pod \"telemetry-operator-controller-manager-d6b694c5-zcdvr\" (UID: \"164abed1-8fb0-4276-acc5-08c87a08ba9a\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.155456 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.155484 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-t9m57\" (UniqueName: \"kubernetes.io/projected/6d221d10-3c53-4137-88fc-8905e46b397c-kube-api-access-t9m57\") pod \"test-operator-controller-manager-5c5cb9c4d7-g6nkz\" (UID: \"6d221d10-3c53-4137-88fc-8905e46b397c\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.155538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v824j\" (UniqueName: \"kubernetes.io/projected/ade0d786-88b7-465a-917a-7147ae923a01-kube-api-access-v824j\") pod \"placement-operator-controller-manager-5784578c99-rbjxs\" (UID: \"ade0d786-88b7-465a-917a-7147ae923a01\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.155614 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-449dp\" (UniqueName: \"kubernetes.io/projected/2a2f4027-c0c8-4032-9e40-ab2ce99c899f-kube-api-access-449dp\") pod \"swift-operator-controller-manager-c674c5965-r479m\" (UID: \"2a2f4027-c0c8-4032-9e40-ab2ce99c899f\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.155883 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.155966 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert podName:6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992 nodeName:}" failed. No retries permitted until 2026-03-19 15:33:39.155945093 +0000 UTC m=+1078.384566295 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert") pod "infra-operator-controller-manager-7b487c85ff-rdr25" (UID: "6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992") : secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.164451 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.165456 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.169156 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.169315 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.169424 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nz5nr" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.177361 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v824j\" (UniqueName: \"kubernetes.io/projected/ade0d786-88b7-465a-917a-7147ae923a01-kube-api-access-v824j\") pod \"placement-operator-controller-manager-5784578c99-rbjxs\" (UID: \"ade0d786-88b7-465a-917a-7147ae923a01\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.178027 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.181370 4771 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.196343 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.197684 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.202290 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sddkh" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.208951 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.222549 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-449dp\" (UniqueName: \"kubernetes.io/projected/2a2f4027-c0c8-4032-9e40-ab2ce99c899f-kube-api-access-449dp\") pod \"swift-operator-controller-manager-c674c5965-r479m\" (UID: \"2a2f4027-c0c8-4032-9e40-ab2ce99c899f\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.223426 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.257719 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjq6x\" (UniqueName: \"kubernetes.io/projected/164abed1-8fb0-4276-acc5-08c87a08ba9a-kube-api-access-xjq6x\") pod \"telemetry-operator-controller-manager-d6b694c5-zcdvr\" (UID: \"164abed1-8fb0-4276-acc5-08c87a08ba9a\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.257761 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.257802 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9m57\" (UniqueName: \"kubernetes.io/projected/6d221d10-3c53-4137-88fc-8905e46b397c-kube-api-access-t9m57\") pod \"test-operator-controller-manager-5c5cb9c4d7-g6nkz\" (UID: \"6d221d10-3c53-4137-88fc-8905e46b397c\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.257821 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trf6c\" (UniqueName: \"kubernetes.io/projected/60385688-48aa-4671-9854-60eb4e36f072-kube-api-access-trf6c\") pod \"watcher-operator-controller-manager-6c4d75f7f9-p4bjm\" (UID: \"60385688-48aa-4671-9854-60eb4e36f072\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" Mar 19 15:33:38 crc 
kubenswrapper[4771]: I0319 15:33:38.257850 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.257871 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x8lt\" (UniqueName: \"kubernetes.io/projected/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-kube-api-access-6x8lt\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.266322 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.281229 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9m57\" (UniqueName: \"kubernetes.io/projected/6d221d10-3c53-4137-88fc-8905e46b397c-kube-api-access-t9m57\") pod \"test-operator-controller-manager-5c5cb9c4d7-g6nkz\" (UID: \"6d221d10-3c53-4137-88fc-8905e46b397c\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.286471 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.288489 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjq6x\" (UniqueName: \"kubernetes.io/projected/164abed1-8fb0-4276-acc5-08c87a08ba9a-kube-api-access-xjq6x\") pod \"telemetry-operator-controller-manager-d6b694c5-zcdvr\" (UID: \"164abed1-8fb0-4276-acc5-08c87a08ba9a\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" Mar 19 15:33:38 crc kubenswrapper[4771]: W0319 15:33:38.309351 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4717b1db_fd1d_4e9f_b04d_b88488b35369.slice/crio-fbc9b55508a4deb704fd2418570c72948b3b730bf372f980ca931d6059dc1591 WatchSource:0}: Error finding container fbc9b55508a4deb704fd2418570c72948b3b730bf372f980ca931d6059dc1591: Status 404 returned error can't find the container with id fbc9b55508a4deb704fd2418570c72948b3b730bf372f980ca931d6059dc1591 Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.358900 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.358972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trf6c\" (UniqueName: \"kubernetes.io/projected/60385688-48aa-4671-9854-60eb4e36f072-kube-api-access-trf6c\") pod \"watcher-operator-controller-manager-6c4d75f7f9-p4bjm\" (UID: \"60385688-48aa-4671-9854-60eb4e36f072\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" 
Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.359047 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.359068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x8lt\" (UniqueName: \"kubernetes.io/projected/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-kube-api-access-6x8lt\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.359153 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9676\" (UniqueName: \"kubernetes.io/projected/d6cbfd2b-61bd-433e-95e8-8351340d720f-kube-api-access-s9676\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jrbt9\" (UID: \"d6cbfd2b-61bd-433e-95e8-8351340d720f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.359301 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.359339 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:38.859326788 +0000 UTC m=+1078.087947990 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.359676 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.359699 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:38.859691928 +0000 UTC m=+1078.088313130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "metrics-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.366829 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.380623 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x8lt\" (UniqueName: \"kubernetes.io/projected/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-kube-api-access-6x8lt\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.390838 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trf6c\" (UniqueName: \"kubernetes.io/projected/60385688-48aa-4671-9854-60eb4e36f072-kube-api-access-trf6c\") pod \"watcher-operator-controller-manager-6c4d75f7f9-p4bjm\" (UID: \"60385688-48aa-4671-9854-60eb4e36f072\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.458005 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.460570 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9676\" (UniqueName: \"kubernetes.io/projected/d6cbfd2b-61bd-433e-95e8-8351340d720f-kube-api-access-s9676\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jrbt9\" (UID: \"d6cbfd2b-61bd-433e-95e8-8351340d720f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.469298 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.478886 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9676\" (UniqueName: \"kubernetes.io/projected/d6cbfd2b-61bd-433e-95e8-8351340d720f-kube-api-access-s9676\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jrbt9\" (UID: \"d6cbfd2b-61bd-433e-95e8-8351340d720f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.520008 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.554182 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.561469 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.561606 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.561651 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert podName:7ccc7b81-b1de-48e8-aec9-f88f615ebf88 nodeName:}" failed. 
No retries permitted until 2026-03-19 15:33:39.561639648 +0000 UTC m=+1078.790260850 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" (UID: "7ccc7b81-b1de-48e8-aec9-f88f615ebf88") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.740078 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" event={"ID":"297266bf-7ed9-43bf-abfa-d608acf96290","Type":"ContainerStarted","Data":"83660f199429b34dfc3ee3c4aafaf150d6b2c55393618e73563904fd09c1bb74"} Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.741389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" event={"ID":"4717b1db-fd1d-4e9f-b04d-b88488b35369","Type":"ContainerStarted","Data":"fbc9b55508a4deb704fd2418570c72948b3b730bf372f980ca931d6059dc1591"} Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.858094 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.863832 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc"] Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.864428 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc 
kubenswrapper[4771]: I0319 15:33:38.864485 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.864690 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.864739 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:39.864726353 +0000 UTC m=+1079.093347555 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "metrics-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.865035 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: E0319 15:33:38.865063 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:39.865055301 +0000 UTC m=+1079.093676503 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "webhook-server-cert" not found Mar 19 15:33:38 crc kubenswrapper[4771]: I0319 15:33:38.874102 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28"] Mar 19 15:33:38 crc kubenswrapper[4771]: W0319 15:33:38.884799 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d301298_e7fb_4ce7_8369_7d1887b6a913.slice/crio-83e4e906f512c51ba9bdc4ffd79a105fbcbbba2336aab2036662f43716fbca7d WatchSource:0}: Error finding container 83e4e906f512c51ba9bdc4ffd79a105fbcbbba2336aab2036662f43716fbca7d: Status 404 returned error can't find the container with id 83e4e906f512c51ba9bdc4ffd79a105fbcbbba2336aab2036662f43716fbca7d Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.139905 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.147364 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.153176 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.190816 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " 
pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.191162 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.191233 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert podName:6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992 nodeName:}" failed. No retries permitted until 2026-03-19 15:33:41.191215682 +0000 UTC m=+1080.419836894 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert") pod "infra-operator-controller-manager-7b487c85ff-rdr25" (UID: "6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992") : secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.191467 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-mldfm"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.231257 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk"] Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.236706 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6rblf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-w4pll_openstack-operators(6c6e6d57-bc57-4368-9b9f-ce85dbf99b46): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.237607 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4lwd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-knffk_openstack-operators(3fc80c77-8c96-492a-8da1-4e617cfc2878): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.237954 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" podUID="6c6e6d57-bc57-4368-9b9f-ce85dbf99b46" Mar 19 15:33:39 crc 
kubenswrapper[4771]: E0319 15:33:39.238787 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" podUID="3fc80c77-8c96-492a-8da1-4e617cfc2878" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.240932 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz"] Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.246817 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xjq6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-zcdvr_openstack-operators(164abed1-8fb0-4276-acc5-08c87a08ba9a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.247908 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9"] Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.247973 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" podUID="164abed1-8fb0-4276-acc5-08c87a08ba9a" Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.251171 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v824j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-rbjxs_openstack-operators(ade0d786-88b7-465a-917a-7147ae923a01): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.253181 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" podUID="ade0d786-88b7-465a-917a-7147ae923a01" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.253236 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.257481 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr"] Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.262381 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rcrwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-t7mdc_openstack-operators(3196e589-b895-4c94-aa2b-1b4a1b0786cf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.263628 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" podUID="3196e589-b895-4c94-aa2b-1b4a1b0786cf" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.265523 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm"] Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.268978 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s9676,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-jrbt9_openstack-operators(d6cbfd2b-61bd-433e-95e8-8351340d720f): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.270201 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" podUID="d6cbfd2b-61bd-433e-95e8-8351340d720f" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.271137 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-r479m"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.276812 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.282089 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.286477 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.291324 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc"] Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.596140 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.596304 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: 
secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.596351 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert podName:7ccc7b81-b1de-48e8-aec9-f88f615ebf88 nodeName:}" failed. No retries permitted until 2026-03-19 15:33:41.596337832 +0000 UTC m=+1080.824959034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" (UID: "7ccc7b81-b1de-48e8-aec9-f88f615ebf88") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.751129 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" event={"ID":"14fa11c5-1371-4da1-aa9b-8b7b2463600e","Type":"ContainerStarted","Data":"e1888bf3f039974cb930408c7cddd223506fa10e94c4f8da4ef3a121b88577a7"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.752765 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" event={"ID":"2a2f4027-c0c8-4032-9e40-ab2ce99c899f","Type":"ContainerStarted","Data":"42ea86b29c41b8e6b3c720a0afc9f74ff3a8341e8745b3fd15f9d7b54e0f7ba8"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.754260 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" event={"ID":"3701ec62-21e3-4bb7-8e32-c09fb4c5d619","Type":"ContainerStarted","Data":"8689632ade3968e9a659637bc1c6ada8432c373fc0e65d6a4658a98882bde5e4"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.755848 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" 
event={"ID":"3196e589-b895-4c94-aa2b-1b4a1b0786cf","Type":"ContainerStarted","Data":"f5785a4300939e28ed0e21ce3f335372bc78b6281da9781f612ef9c64f991566"} Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.758656 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" podUID="3196e589-b895-4c94-aa2b-1b4a1b0786cf" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.759085 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" event={"ID":"6d221d10-3c53-4137-88fc-8905e46b397c","Type":"ContainerStarted","Data":"de64b6698d7cf4db0aa397921fdaad544db072f2cdf07910ebb945451bfcc396"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.760311 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" event={"ID":"ade0d786-88b7-465a-917a-7147ae923a01","Type":"ContainerStarted","Data":"0588fcbf8056393b9f30ec8799f629e71d4ecc59d907651601ad88d5c284c633"} Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.761619 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" podUID="ade0d786-88b7-465a-917a-7147ae923a01" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.762187 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" 
event={"ID":"164abed1-8fb0-4276-acc5-08c87a08ba9a","Type":"ContainerStarted","Data":"2706e7e2080e7ba765327605c0245de49ff8e4c0ee462dcf9bdaf243915fda61"} Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.764815 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" podUID="164abed1-8fb0-4276-acc5-08c87a08ba9a" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.765004 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" event={"ID":"60385688-48aa-4671-9854-60eb4e36f072","Type":"ContainerStarted","Data":"a8adb67b1bd4e6ad63ec5f8744f26eba01f8714c43259c5db7cc6bde784d45d3"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.766337 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" event={"ID":"8d301298-e7fb-4ce7-8369-7d1887b6a913","Type":"ContainerStarted","Data":"83e4e906f512c51ba9bdc4ffd79a105fbcbbba2336aab2036662f43716fbca7d"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.778626 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" event={"ID":"f10d9b25-6f62-4300-9827-bebe80433dda","Type":"ContainerStarted","Data":"b8dc226e3b025d86d310cb21216d2b2919a42f091a7c20becd4de8b6c69aaa6d"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.780586 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" 
event={"ID":"bb45e644-93ea-41f8-96b5-bf1765f44488","Type":"ContainerStarted","Data":"67484f4ddede2dfcc6a3644010d3805c4fbdf6ca9eb040dfc133997e6b9ed428"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.782177 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" event={"ID":"8caf5d5b-cffa-4b03-b9b9-7bd54217fda6","Type":"ContainerStarted","Data":"d069d28aa9e391338948412afa1e1926be7865aeab7afdcce79c6b8333ff891c"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.800043 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" event={"ID":"3fc80c77-8c96-492a-8da1-4e617cfc2878","Type":"ContainerStarted","Data":"45895fd6167aa3ab825794f477d5f3d81cc5e1ebb2301c861018f92b15abe913"} Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.802883 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" podUID="3fc80c77-8c96-492a-8da1-4e617cfc2878" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.803564 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" event={"ID":"051674d9-53cb-4cbc-ae54-b6beb16456ee","Type":"ContainerStarted","Data":"c249f982c935dac61882f8db0ee8c5cda28b98f5d9d95f3b71caf6e679875642"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.804959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" 
event={"ID":"d6cbfd2b-61bd-433e-95e8-8351340d720f","Type":"ContainerStarted","Data":"1c818274edeaebcd1f9229f0186c617e9d73f4f55ce4302ade2035cde80402d7"} Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.806022 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" podUID="d6cbfd2b-61bd-433e-95e8-8351340d720f" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.820609 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" event={"ID":"682bc21b-ae46-487c-b1f8-a8626914fff4","Type":"ContainerStarted","Data":"6e69adff71c466cf08c4fda89eb3be5474e3f1a0b673fcbcf2d3d8290fb86fa4"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.825394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" event={"ID":"6c6e6d57-bc57-4368-9b9f-ce85dbf99b46","Type":"ContainerStarted","Data":"ebe4d1f7ba1cc570b2007d9a74e847309f574300e6cc8b03dd295ce0b348e968"} Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.826896 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" podUID="6c6e6d57-bc57-4368-9b9f-ce85dbf99b46" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.828382 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" 
event={"ID":"0200604c-cbeb-45ea-9f92-b5f857d05b23","Type":"ContainerStarted","Data":"428e4cdc5d89e68a6cc9458ac749a3013a77f24038abc65087cab0039efc24a0"} Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.899809 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:39 crc kubenswrapper[4771]: I0319 15:33:39.899883 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.900031 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.900076 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:41.900062842 +0000 UTC m=+1081.128684044 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "metrics-server-cert" not found Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.900575 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 15:33:39 crc kubenswrapper[4771]: E0319 15:33:39.900667 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:41.900643436 +0000 UTC m=+1081.129264688 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "webhook-server-cert" not found Mar 19 15:33:40 crc kubenswrapper[4771]: E0319 15:33:40.838179 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" podUID="3196e589-b895-4c94-aa2b-1b4a1b0786cf" Mar 19 15:33:40 crc kubenswrapper[4771]: E0319 15:33:40.839939 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" podUID="3fc80c77-8c96-492a-8da1-4e617cfc2878" Mar 19 15:33:40 crc kubenswrapper[4771]: E0319 15:33:40.840337 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" podUID="ade0d786-88b7-465a-917a-7147ae923a01" Mar 19 15:33:40 crc kubenswrapper[4771]: E0319 15:33:40.840394 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" podUID="164abed1-8fb0-4276-acc5-08c87a08ba9a" Mar 19 15:33:40 crc kubenswrapper[4771]: E0319 15:33:40.840440 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" podUID="d6cbfd2b-61bd-433e-95e8-8351340d720f" Mar 19 15:33:40 crc kubenswrapper[4771]: E0319 15:33:40.840485 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" 
podUID="6c6e6d57-bc57-4368-9b9f-ce85dbf99b46" Mar 19 15:33:41 crc kubenswrapper[4771]: I0319 15:33:41.223843 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:41 crc kubenswrapper[4771]: E0319 15:33:41.224240 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:41 crc kubenswrapper[4771]: E0319 15:33:41.224744 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert podName:6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992 nodeName:}" failed. No retries permitted until 2026-03-19 15:33:45.224713795 +0000 UTC m=+1084.453334987 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert") pod "infra-operator-controller-manager-7b487c85ff-rdr25" (UID: "6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992") : secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:41 crc kubenswrapper[4771]: I0319 15:33:41.630529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:41 crc kubenswrapper[4771]: E0319 15:33:41.630746 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:41 crc kubenswrapper[4771]: E0319 15:33:41.630798 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert podName:7ccc7b81-b1de-48e8-aec9-f88f615ebf88 nodeName:}" failed. No retries permitted until 2026-03-19 15:33:45.630780978 +0000 UTC m=+1084.859402180 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" (UID: "7ccc7b81-b1de-48e8-aec9-f88f615ebf88") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:41 crc kubenswrapper[4771]: I0319 15:33:41.934242 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:41 crc kubenswrapper[4771]: I0319 15:33:41.934307 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:41 crc kubenswrapper[4771]: E0319 15:33:41.934393 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 15:33:41 crc kubenswrapper[4771]: E0319 15:33:41.934433 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 15:33:41 crc kubenswrapper[4771]: E0319 15:33:41.934442 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:45.934428627 +0000 UTC m=+1085.163049829 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "webhook-server-cert" not found Mar 19 15:33:41 crc kubenswrapper[4771]: E0319 15:33:41.934460 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:45.934450907 +0000 UTC m=+1085.163072109 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "metrics-server-cert" not found Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.517669 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r5frl"] Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.519466 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.532175 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r5frl"] Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.646545 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-catalog-content\") pod \"certified-operators-r5frl\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.646803 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-utilities\") pod \"certified-operators-r5frl\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.647152 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgxrh\" (UniqueName: \"kubernetes.io/projected/003252d4-823f-4462-94cf-77521e666cec-kube-api-access-cgxrh\") pod \"certified-operators-r5frl\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.749866 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-catalog-content\") pod \"certified-operators-r5frl\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.750004 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-utilities\") pod \"certified-operators-r5frl\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.750074 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgxrh\" (UniqueName: \"kubernetes.io/projected/003252d4-823f-4462-94cf-77521e666cec-kube-api-access-cgxrh\") pod \"certified-operators-r5frl\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.750502 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-catalog-content\") pod \"certified-operators-r5frl\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.750551 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-utilities\") pod \"certified-operators-r5frl\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.779526 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgxrh\" (UniqueName: \"kubernetes.io/projected/003252d4-823f-4462-94cf-77521e666cec-kube-api-access-cgxrh\") pod \"certified-operators-r5frl\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:42 crc kubenswrapper[4771]: I0319 15:33:42.849331 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:33:45 crc kubenswrapper[4771]: I0319 15:33:45.286082 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:45 crc kubenswrapper[4771]: E0319 15:33:45.286307 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:45 crc kubenswrapper[4771]: E0319 15:33:45.286581 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert podName:6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992 nodeName:}" failed. No retries permitted until 2026-03-19 15:33:53.286553057 +0000 UTC m=+1092.515174289 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert") pod "infra-operator-controller-manager-7b487c85ff-rdr25" (UID: "6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992") : secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:45 crc kubenswrapper[4771]: I0319 15:33:45.694642 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:45 crc kubenswrapper[4771]: E0319 15:33:45.694882 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:45 crc kubenswrapper[4771]: E0319 15:33:45.695037 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert podName:7ccc7b81-b1de-48e8-aec9-f88f615ebf88 nodeName:}" failed. No retries permitted until 2026-03-19 15:33:53.695013299 +0000 UTC m=+1092.923634501 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" (UID: "7ccc7b81-b1de-48e8-aec9-f88f615ebf88") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:45 crc kubenswrapper[4771]: I0319 15:33:45.998370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:45 crc kubenswrapper[4771]: E0319 15:33:45.998485 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 15:33:45 crc kubenswrapper[4771]: I0319 15:33:45.998525 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:45 crc kubenswrapper[4771]: E0319 15:33:45.998538 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:53.998523513 +0000 UTC m=+1093.227144715 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "metrics-server-cert" not found Mar 19 15:33:45 crc kubenswrapper[4771]: E0319 15:33:45.998639 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 15:33:45 crc kubenswrapper[4771]: E0319 15:33:45.998682 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:33:53.998669628 +0000 UTC m=+1093.227290830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "webhook-server-cert" not found Mar 19 15:33:53 crc kubenswrapper[4771]: I0319 15:33:53.308331 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:33:53 crc kubenswrapper[4771]: E0319 15:33:53.308580 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:53 crc kubenswrapper[4771]: E0319 15:33:53.308705 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert 
podName:6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992 nodeName:}" failed. No retries permitted until 2026-03-19 15:34:09.308675528 +0000 UTC m=+1108.537296760 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert") pod "infra-operator-controller-manager-7b487c85ff-rdr25" (UID: "6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992") : secret "infra-operator-webhook-server-cert" not found Mar 19 15:33:53 crc kubenswrapper[4771]: I0319 15:33:53.714619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:33:53 crc kubenswrapper[4771]: E0319 15:33:53.714793 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:53 crc kubenswrapper[4771]: E0319 15:33:53.714856 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert podName:7ccc7b81-b1de-48e8-aec9-f88f615ebf88 nodeName:}" failed. No retries permitted until 2026-03-19 15:34:09.714840695 +0000 UTC m=+1108.943461897 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" (UID: "7ccc7b81-b1de-48e8-aec9-f88f615ebf88") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 15:33:54 crc kubenswrapper[4771]: I0319 15:33:54.060320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:54 crc kubenswrapper[4771]: I0319 15:33:54.060442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:33:54 crc kubenswrapper[4771]: E0319 15:33:54.060632 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 15:33:54 crc kubenswrapper[4771]: E0319 15:33:54.060742 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 15:33:54 crc kubenswrapper[4771]: E0319 15:33:54.060775 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:34:10.06074025 +0000 UTC m=+1109.289361492 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "webhook-server-cert" not found Mar 19 15:33:54 crc kubenswrapper[4771]: E0319 15:33:54.060850 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs podName:3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d nodeName:}" failed. No retries permitted until 2026-03-19 15:34:10.060820622 +0000 UTC m=+1109.289441884 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs") pod "openstack-operator-controller-manager-75b679d9b8-4nqx6" (UID: "3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d") : secret "metrics-server-cert" not found Mar 19 15:33:54 crc kubenswrapper[4771]: E0319 15:33:54.489836 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900" Mar 19 15:33:54 crc kubenswrapper[4771]: E0319 15:33:54.490596 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mhxsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-7p6l9_openstack-operators(f10d9b25-6f62-4300-9827-bebe80433dda): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 15:33:54 crc kubenswrapper[4771]: E0319 15:33:54.491878 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" podUID="f10d9b25-6f62-4300-9827-bebe80433dda" Mar 19 15:33:54 crc kubenswrapper[4771]: E0319 15:33:54.942818 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" podUID="f10d9b25-6f62-4300-9827-bebe80433dda" Mar 19 15:33:55 crc kubenswrapper[4771]: E0319 15:33:55.500937 4771 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a" Mar 19 15:33:55 crc kubenswrapper[4771]: E0319 15:33:55.501191 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zdbt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-xhqbn_openstack-operators(14fa11c5-1371-4da1-aa9b-8b7b2463600e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 15:33:55 crc kubenswrapper[4771]: E0319 15:33:55.502898 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" podUID="14fa11c5-1371-4da1-aa9b-8b7b2463600e" Mar 19 15:33:55 crc kubenswrapper[4771]: E0319 15:33:55.948264 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" podUID="14fa11c5-1371-4da1-aa9b-8b7b2463600e" Mar 19 15:33:56 crc kubenswrapper[4771]: E0319 15:33:56.144667 4771 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e" Mar 19 15:33:56 crc kubenswrapper[4771]: E0319 15:33:56.144821 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-449dp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-r479m_openstack-operators(2a2f4027-c0c8-4032-9e40-ab2ce99c899f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 15:33:56 crc kubenswrapper[4771]: E0319 15:33:56.146088 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" podUID="2a2f4027-c0c8-4032-9e40-ab2ce99c899f" Mar 19 15:33:56 crc kubenswrapper[4771]: E0319 15:33:56.701696 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 19 15:33:56 crc kubenswrapper[4771]: E0319 15:33:56.702137 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5gl7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-5zj7h_openstack-operators(051674d9-53cb-4cbc-ae54-b6beb16456ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 15:33:56 crc kubenswrapper[4771]: E0319 15:33:56.703902 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" podUID="051674d9-53cb-4cbc-ae54-b6beb16456ee" Mar 19 15:33:56 crc kubenswrapper[4771]: E0319 15:33:56.954051 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" podUID="2a2f4027-c0c8-4032-9e40-ab2ce99c899f" Mar 19 15:33:56 crc kubenswrapper[4771]: E0319 15:33:56.954303 4771 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" podUID="051674d9-53cb-4cbc-ae54-b6beb16456ee" Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.185764 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565574-p6k29"] Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.187126 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565574-p6k29" Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.191503 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.191716 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.191778 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565574-p6k29"] Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.191836 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.364356 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzd8\" (UniqueName: \"kubernetes.io/projected/ed7bd159-532b-4d06-9079-4a04e5a1c3a4-kube-api-access-qkzd8\") pod \"auto-csr-approver-29565574-p6k29\" (UID: \"ed7bd159-532b-4d06-9079-4a04e5a1c3a4\") " pod="openshift-infra/auto-csr-approver-29565574-p6k29" Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.465620 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qkzd8\" (UniqueName: \"kubernetes.io/projected/ed7bd159-532b-4d06-9079-4a04e5a1c3a4-kube-api-access-qkzd8\") pod \"auto-csr-approver-29565574-p6k29\" (UID: \"ed7bd159-532b-4d06-9079-4a04e5a1c3a4\") " pod="openshift-infra/auto-csr-approver-29565574-p6k29" Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.484834 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzd8\" (UniqueName: \"kubernetes.io/projected/ed7bd159-532b-4d06-9079-4a04e5a1c3a4-kube-api-access-qkzd8\") pod \"auto-csr-approver-29565574-p6k29\" (UID: \"ed7bd159-532b-4d06-9079-4a04e5a1c3a4\") " pod="openshift-infra/auto-csr-approver-29565574-p6k29" Mar 19 15:34:00 crc kubenswrapper[4771]: I0319 15:34:00.510805 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565574-p6k29" Mar 19 15:34:01 crc kubenswrapper[4771]: I0319 15:34:01.081296 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r5frl"] Mar 19 15:34:01 crc kubenswrapper[4771]: I0319 15:34:01.984829 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5frl" event={"ID":"003252d4-823f-4462-94cf-77521e666cec","Type":"ContainerStarted","Data":"4ce35f0ee8dcaa35a4fedd670ae0f4c653d880b9323c49bb7bcbe190156586e6"} Mar 19 15:34:02 crc kubenswrapper[4771]: I0319 15:34:02.641898 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565574-p6k29"] Mar 19 15:34:02 crc kubenswrapper[4771]: W0319 15:34:02.691932 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded7bd159_532b_4d06_9079_4a04e5a1c3a4.slice/crio-5da1504a7e3ae8c02ae7d15754ec26eeff326a997405b77c41c42791aaa9320d WatchSource:0}: Error finding container 5da1504a7e3ae8c02ae7d15754ec26eeff326a997405b77c41c42791aaa9320d: Status 404 returned error can't 
find the container with id 5da1504a7e3ae8c02ae7d15754ec26eeff326a997405b77c41c42791aaa9320d Mar 19 15:34:02 crc kubenswrapper[4771]: I0319 15:34:02.992376 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" event={"ID":"6d221d10-3c53-4137-88fc-8905e46b397c","Type":"ContainerStarted","Data":"4bac65bf6ccfd41cc01395b8dfce2e9727e8e8b95501e93be94bb2c2420d8caa"} Mar 19 15:34:02 crc kubenswrapper[4771]: I0319 15:34:02.992726 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.001135 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" event={"ID":"682bc21b-ae46-487c-b1f8-a8626914fff4","Type":"ContainerStarted","Data":"20638d83b70cb81edffc0e0e05043aba146d7687b7e076096f39e5548e3b3824"} Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.001776 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.014908 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" event={"ID":"297266bf-7ed9-43bf-abfa-d608acf96290","Type":"ContainerStarted","Data":"bd2951b9042318f128371e6c301e86855a799931362e151ef5fedf9b9584f822"} Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.015200 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.020107 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" 
event={"ID":"bb45e644-93ea-41f8-96b5-bf1765f44488","Type":"ContainerStarted","Data":"4b85b1765d5dc31b6a40bfb09500ae7c32fca4d8a8fe9b40a4939b66710eb4ab"} Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.020274 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.058849 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" podStartSLOduration=6.194866933 podStartE2EDuration="26.058830091s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.217791221 +0000 UTC m=+1078.446412423" lastFinishedPulling="2026-03-19 15:33:59.081754379 +0000 UTC m=+1098.310375581" observedRunningTime="2026-03-19 15:34:03.031352961 +0000 UTC m=+1102.259974163" watchObservedRunningTime="2026-03-19 15:34:03.058830091 +0000 UTC m=+1102.287451293" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.059736 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" podStartSLOduration=8.251168086 podStartE2EDuration="26.059730383s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:38.866283551 +0000 UTC m=+1078.094904753" lastFinishedPulling="2026-03-19 15:33:56.674845848 +0000 UTC m=+1095.903467050" observedRunningTime="2026-03-19 15:34:03.056214468 +0000 UTC m=+1102.284835670" watchObservedRunningTime="2026-03-19 15:34:03.059730383 +0000 UTC m=+1102.288351585" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.064177 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" 
event={"ID":"4717b1db-fd1d-4e9f-b04d-b88488b35369","Type":"ContainerStarted","Data":"9e6ed8bec3dda87f71a5befd7af9192dde73ff1bd229a5547c7e6ac1cb6f7e2c"} Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.064751 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.079912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" event={"ID":"3701ec62-21e3-4bb7-8e32-c09fb4c5d619","Type":"ContainerStarted","Data":"ae87ad0992fcc960a471d689ed3964e38015b2a11b3254751ea8ab1f04a715aa"} Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.080044 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.083518 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" event={"ID":"0200604c-cbeb-45ea-9f92-b5f857d05b23","Type":"ContainerStarted","Data":"ede507d656ca3928bdd652304b09693614e6a55adb77d863e1bf7cfa0e2d9f83"} Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.083799 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.084715 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565574-p6k29" event={"ID":"ed7bd159-532b-4d06-9079-4a04e5a1c3a4","Type":"ContainerStarted","Data":"5da1504a7e3ae8c02ae7d15754ec26eeff326a997405b77c41c42791aaa9320d"} Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.086083 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5frl" 
event={"ID":"003252d4-823f-4462-94cf-77521e666cec","Type":"ContainerStarted","Data":"46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553"} Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.091030 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" podStartSLOduration=8.536993706 podStartE2EDuration="26.091011716s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:38.568561435 +0000 UTC m=+1077.797182637" lastFinishedPulling="2026-03-19 15:33:56.122579445 +0000 UTC m=+1095.351200647" observedRunningTime="2026-03-19 15:34:03.088067865 +0000 UTC m=+1102.316689067" watchObservedRunningTime="2026-03-19 15:34:03.091011716 +0000 UTC m=+1102.319632908" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.094204 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" event={"ID":"60385688-48aa-4671-9854-60eb4e36f072","Type":"ContainerStarted","Data":"9d4e9f4366d35e6603ff1b453c5b14c04a958ff0af1bde177222c89a9edd898d"} Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.094342 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.112872 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" podStartSLOduration=7.814972158 podStartE2EDuration="26.112857062s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:38.884874618 +0000 UTC m=+1078.113495820" lastFinishedPulling="2026-03-19 15:33:57.182759482 +0000 UTC m=+1096.411380724" observedRunningTime="2026-03-19 15:34:03.107629156 +0000 UTC m=+1102.336250358" watchObservedRunningTime="2026-03-19 
15:34:03.112857062 +0000 UTC m=+1102.341478264" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.168998 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" podStartSLOduration=8.712191452999999 podStartE2EDuration="26.168971803s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.217978066 +0000 UTC m=+1078.446599268" lastFinishedPulling="2026-03-19 15:33:56.674758406 +0000 UTC m=+1095.903379618" observedRunningTime="2026-03-19 15:34:03.16805606 +0000 UTC m=+1102.396677262" watchObservedRunningTime="2026-03-19 15:34:03.168971803 +0000 UTC m=+1102.397593005" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.227585 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" podStartSLOduration=9.049575802 podStartE2EDuration="26.227570663s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:38.311410725 +0000 UTC m=+1077.540031927" lastFinishedPulling="2026-03-19 15:33:55.489405586 +0000 UTC m=+1094.718026788" observedRunningTime="2026-03-19 15:34:03.197410097 +0000 UTC m=+1102.426031289" watchObservedRunningTime="2026-03-19 15:34:03.227570663 +0000 UTC m=+1102.456191865" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.230737 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" podStartSLOduration=8.255010829 podStartE2EDuration="26.230726019s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.207035602 +0000 UTC m=+1078.435656804" lastFinishedPulling="2026-03-19 15:33:57.182750782 +0000 UTC m=+1096.411371994" observedRunningTime="2026-03-19 15:34:03.226413555 +0000 UTC m=+1102.455034757" watchObservedRunningTime="2026-03-19 
15:34:03.230726019 +0000 UTC m=+1102.459347211" Mar 19 15:34:03 crc kubenswrapper[4771]: I0319 15:34:03.243320 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" podStartSLOduration=8.836995826 podStartE2EDuration="26.243301042s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.267849275 +0000 UTC m=+1078.496470467" lastFinishedPulling="2026-03-19 15:33:56.674154481 +0000 UTC m=+1095.902775683" observedRunningTime="2026-03-19 15:34:03.240223038 +0000 UTC m=+1102.468844240" watchObservedRunningTime="2026-03-19 15:34:03.243301042 +0000 UTC m=+1102.471922244" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.101472 4771 generic.go:334] "Generic (PLEG): container finished" podID="003252d4-823f-4462-94cf-77521e666cec" containerID="46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553" exitCode=0 Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.101543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5frl" event={"ID":"003252d4-823f-4462-94cf-77521e666cec","Type":"ContainerDied","Data":"46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553"} Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.102957 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" event={"ID":"8caf5d5b-cffa-4b03-b9b9-7bd54217fda6","Type":"ContainerStarted","Data":"bdda9f9524f1fa1e461e1b8128b3a6dad0dafb74e55310dbde1afcd5e647e465"} Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.103957 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.105953 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" event={"ID":"ade0d786-88b7-465a-917a-7147ae923a01","Type":"ContainerStarted","Data":"8ca2b6f80c57a698ec6c4de5cc38c432cd0cf8b7c9c05f5076f2fb896b64eb7c"} Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.106548 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.108188 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" event={"ID":"d6cbfd2b-61bd-433e-95e8-8351340d720f","Type":"ContainerStarted","Data":"4817fd28227083bc96dc76a89f48bd49280571554d34568f329181723c3ccc83"} Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.110016 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" event={"ID":"8d301298-e7fb-4ce7-8369-7d1887b6a913","Type":"ContainerStarted","Data":"938053a7faa6e709e8118c2b1ca6ab2572af168f757831170fc6606457e9ec02"} Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.110095 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.114627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" event={"ID":"3196e589-b895-4c94-aa2b-1b4a1b0786cf","Type":"ContainerStarted","Data":"8d799287e36c56138330e473fdb229bc61fb4eb59ccf04c9225c5912824caac6"} Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.115241 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.117147 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" event={"ID":"6c6e6d57-bc57-4368-9b9f-ce85dbf99b46","Type":"ContainerStarted","Data":"2a139d436a0e0561bb6a837b3df23625f98cc5c5a381e4a869cf24bd70481998"} Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.117592 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.120734 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" event={"ID":"3fc80c77-8c96-492a-8da1-4e617cfc2878","Type":"ContainerStarted","Data":"c6533591d3fcc23b433e4947001292b74c7b56ee4499f5850aac328461d894b8"} Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.121221 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.122755 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" event={"ID":"164abed1-8fb0-4276-acc5-08c87a08ba9a","Type":"ContainerStarted","Data":"a6b7b90ac798de4ba00abb07b6720980f8321e6ead5f9ac0edfa1a9066ed70d4"} Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.123112 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.178770 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" podStartSLOduration=7.317718649 podStartE2EDuration="27.178754737s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.218809105 +0000 UTC 
m=+1078.447430307" lastFinishedPulling="2026-03-19 15:33:59.079845193 +0000 UTC m=+1098.308466395" observedRunningTime="2026-03-19 15:34:04.158466969 +0000 UTC m=+1103.387088171" watchObservedRunningTime="2026-03-19 15:34:04.178754737 +0000 UTC m=+1103.407375939" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.198535 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" podStartSLOduration=8.902571624 podStartE2EDuration="27.198520672s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:38.887361328 +0000 UTC m=+1078.115982530" lastFinishedPulling="2026-03-19 15:33:57.183310376 +0000 UTC m=+1096.411931578" observedRunningTime="2026-03-19 15:34:04.180978761 +0000 UTC m=+1103.409599963" watchObservedRunningTime="2026-03-19 15:34:04.198520672 +0000 UTC m=+1103.427141864" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.260381 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" podStartSLOduration=4.223574108 podStartE2EDuration="27.260364681s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.237458075 +0000 UTC m=+1078.466079267" lastFinishedPulling="2026-03-19 15:34:02.274248638 +0000 UTC m=+1101.502869840" observedRunningTime="2026-03-19 15:34:04.209685741 +0000 UTC m=+1103.438306943" watchObservedRunningTime="2026-03-19 15:34:04.260364681 +0000 UTC m=+1103.488985883" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.279660 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" podStartSLOduration=4.392643736 podStartE2EDuration="27.279642205s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.26222728 +0000 UTC m=+1078.490848482" 
lastFinishedPulling="2026-03-19 15:34:02.149225749 +0000 UTC m=+1101.377846951" observedRunningTime="2026-03-19 15:34:04.257527602 +0000 UTC m=+1103.486148804" watchObservedRunningTime="2026-03-19 15:34:04.279642205 +0000 UTC m=+1103.508263407" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.282703 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" podStartSLOduration=4.420482686 podStartE2EDuration="27.282693258s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.251034791 +0000 UTC m=+1078.479655993" lastFinishedPulling="2026-03-19 15:34:02.113245363 +0000 UTC m=+1101.341866565" observedRunningTime="2026-03-19 15:34:04.27943243 +0000 UTC m=+1103.508053632" watchObservedRunningTime="2026-03-19 15:34:04.282693258 +0000 UTC m=+1103.511314460" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.296255 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" podStartSLOduration=4.430842426 podStartE2EDuration="27.296241474s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.246717037 +0000 UTC m=+1078.475338239" lastFinishedPulling="2026-03-19 15:34:02.112116085 +0000 UTC m=+1101.340737287" observedRunningTime="2026-03-19 15:34:04.294613815 +0000 UTC m=+1103.523235017" watchObservedRunningTime="2026-03-19 15:34:04.296241474 +0000 UTC m=+1103.524862676" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.311294 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jrbt9" podStartSLOduration=3.388874316 podStartE2EDuration="26.311274007s" podCreationTimestamp="2026-03-19 15:33:38 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.268837319 +0000 UTC m=+1078.497458521" 
lastFinishedPulling="2026-03-19 15:34:02.19123701 +0000 UTC m=+1101.419858212" observedRunningTime="2026-03-19 15:34:04.310022726 +0000 UTC m=+1103.538643928" watchObservedRunningTime="2026-03-19 15:34:04.311274007 +0000 UTC m=+1103.539895209" Mar 19 15:34:04 crc kubenswrapper[4771]: I0319 15:34:04.343763 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" podStartSLOduration=4.561651154 podStartE2EDuration="27.343746348s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.23646994 +0000 UTC m=+1078.465091142" lastFinishedPulling="2026-03-19 15:34:02.018565134 +0000 UTC m=+1101.247186336" observedRunningTime="2026-03-19 15:34:04.340980751 +0000 UTC m=+1103.569601943" watchObservedRunningTime="2026-03-19 15:34:04.343746348 +0000 UTC m=+1103.572367540" Mar 19 15:34:07 crc kubenswrapper[4771]: I0319 15:34:07.144705 4771 generic.go:334] "Generic (PLEG): container finished" podID="003252d4-823f-4462-94cf-77521e666cec" containerID="4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c" exitCode=0 Mar 19 15:34:07 crc kubenswrapper[4771]: I0319 15:34:07.144790 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5frl" event={"ID":"003252d4-823f-4462-94cf-77521e666cec","Type":"ContainerDied","Data":"4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c"} Mar 19 15:34:07 crc kubenswrapper[4771]: I0319 15:34:07.151027 4771 generic.go:334] "Generic (PLEG): container finished" podID="ed7bd159-532b-4d06-9079-4a04e5a1c3a4" containerID="48d42026c3f1b277a357c562ccc6caf5fa451542a3c29a2b3e52d367b9067165" exitCode=0 Mar 19 15:34:07 crc kubenswrapper[4771]: I0319 15:34:07.151064 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565574-p6k29" 
event={"ID":"ed7bd159-532b-4d06-9079-4a04e5a1c3a4","Type":"ContainerDied","Data":"48d42026c3f1b277a357c562ccc6caf5fa451542a3c29a2b3e52d367b9067165"} Mar 19 15:34:07 crc kubenswrapper[4771]: I0319 15:34:07.634913 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-x9hpx" Mar 19 15:34:07 crc kubenswrapper[4771]: I0319 15:34:07.687461 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-d4wwp" Mar 19 15:34:07 crc kubenswrapper[4771]: I0319 15:34:07.803326 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hgvwc" Mar 19 15:34:07 crc kubenswrapper[4771]: I0319 15:34:07.834290 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5xksc" Mar 19 15:34:07 crc kubenswrapper[4771]: I0319 15:34:07.947010 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xllmw" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.088920 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-mldfm" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.159938 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" event={"ID":"f10d9b25-6f62-4300-9827-bebe80433dda","Type":"ContainerStarted","Data":"93887f1b3baee11531ae23715c52df59acdc3d5224801c68afe359e0c71685ad"} Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.160181 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" 
Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.168527 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5frl" event={"ID":"003252d4-823f-4462-94cf-77521e666cec","Type":"ContainerStarted","Data":"69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd"} Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.185052 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-6s7qk" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.193661 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" podStartSLOduration=3.1167460240000002 podStartE2EDuration="31.19364347s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.217388351 +0000 UTC m=+1078.446009553" lastFinishedPulling="2026-03-19 15:34:07.294285757 +0000 UTC m=+1106.522906999" observedRunningTime="2026-03-19 15:34:08.18822772 +0000 UTC m=+1107.416848942" watchObservedRunningTime="2026-03-19 15:34:08.19364347 +0000 UTC m=+1107.422264672" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.209743 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r5frl" podStartSLOduration=21.651268395 podStartE2EDuration="26.209720645s" podCreationTimestamp="2026-03-19 15:33:42 +0000 UTC" firstStartedPulling="2026-03-19 15:34:03.087778738 +0000 UTC m=+1102.316399940" lastFinishedPulling="2026-03-19 15:34:07.646230968 +0000 UTC m=+1106.874852190" observedRunningTime="2026-03-19 15:34:08.207375988 +0000 UTC m=+1107.435997190" watchObservedRunningTime="2026-03-19 15:34:08.209720645 +0000 UTC m=+1107.438341847" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.227679 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-5784578c99-rbjxs" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.380737 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zcdvr" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.461105 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6nkz" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.476433 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-p4bjm" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.523860 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565574-p6k29" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.609063 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzd8\" (UniqueName: \"kubernetes.io/projected/ed7bd159-532b-4d06-9079-4a04e5a1c3a4-kube-api-access-qkzd8\") pod \"ed7bd159-532b-4d06-9079-4a04e5a1c3a4\" (UID: \"ed7bd159-532b-4d06-9079-4a04e5a1c3a4\") " Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.621500 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7bd159-532b-4d06-9079-4a04e5a1c3a4-kube-api-access-qkzd8" (OuterVolumeSpecName: "kube-api-access-qkzd8") pod "ed7bd159-532b-4d06-9079-4a04e5a1c3a4" (UID: "ed7bd159-532b-4d06-9079-4a04e5a1c3a4"). InnerVolumeSpecName "kube-api-access-qkzd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:34:08 crc kubenswrapper[4771]: I0319 15:34:08.710707 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkzd8\" (UniqueName: \"kubernetes.io/projected/ed7bd159-532b-4d06-9079-4a04e5a1c3a4-kube-api-access-qkzd8\") on node \"crc\" DevicePath \"\"" Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.177807 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565574-p6k29" event={"ID":"ed7bd159-532b-4d06-9079-4a04e5a1c3a4","Type":"ContainerDied","Data":"5da1504a7e3ae8c02ae7d15754ec26eeff326a997405b77c41c42791aaa9320d"} Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.177879 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5da1504a7e3ae8c02ae7d15754ec26eeff326a997405b77c41c42791aaa9320d" Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.177886 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565574-p6k29" Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.318698 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.328836 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992-cert\") pod \"infra-operator-controller-manager-7b487c85ff-rdr25\" (UID: \"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992\") " pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.346632 4771 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-npzxs" Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.355207 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.618821 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565568-fwdhr"] Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.619926 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565568-fwdhr"] Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.725144 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.731417 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ccc7b81-b1de-48e8-aec9-f88f615ebf88-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4fwmj\" (UID: \"7ccc7b81-b1de-48e8-aec9-f88f615ebf88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.824903 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25"] Mar 19 15:34:09 crc kubenswrapper[4771]: I0319 15:34:09.942773 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-g9kn4" Mar 19 15:34:09 crc 
kubenswrapper[4771]: I0319 15:34:09.951413 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:34:10 crc kubenswrapper[4771]: I0319 15:34:10.132716 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:34:10 crc kubenswrapper[4771]: I0319 15:34:10.132762 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:34:10 crc kubenswrapper[4771]: I0319 15:34:10.137452 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-webhook-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:34:10 crc kubenswrapper[4771]: I0319 15:34:10.137899 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d-metrics-certs\") pod \"openstack-operator-controller-manager-75b679d9b8-4nqx6\" (UID: \"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d\") " pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:34:10 crc 
kubenswrapper[4771]: I0319 15:34:10.154951 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj"] Mar 19 15:34:10 crc kubenswrapper[4771]: W0319 15:34:10.160854 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ccc7b81_b1de_48e8_aec9_f88f615ebf88.slice/crio-a486bef053615abf9ed218f953086160b54248eaf0257682565d2d151adc93ff WatchSource:0}: Error finding container a486bef053615abf9ed218f953086160b54248eaf0257682565d2d151adc93ff: Status 404 returned error can't find the container with id a486bef053615abf9ed218f953086160b54248eaf0257682565d2d151adc93ff Mar 19 15:34:10 crc kubenswrapper[4771]: I0319 15:34:10.184437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" event={"ID":"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992","Type":"ContainerStarted","Data":"00537213962e629214874d70395b44415761f08a6a4126f118bac35757cba5a8"} Mar 19 15:34:10 crc kubenswrapper[4771]: I0319 15:34:10.186028 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" event={"ID":"7ccc7b81-b1de-48e8-aec9-f88f615ebf88","Type":"ContainerStarted","Data":"a486bef053615abf9ed218f953086160b54248eaf0257682565d2d151adc93ff"} Mar 19 15:34:10 crc kubenswrapper[4771]: I0319 15:34:10.292772 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nz5nr" Mar 19 15:34:10 crc kubenswrapper[4771]: I0319 15:34:10.301656 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:34:10 crc kubenswrapper[4771]: I0319 15:34:10.550935 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6"] Mar 19 15:34:10 crc kubenswrapper[4771]: W0319 15:34:10.558522 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fdbf2a8_2d30_446e_9aae_a7a00f4efd0d.slice/crio-a9ef8898a66643bdb2346c34c0a509c37c22b425634242699069cf1cc02a8be5 WatchSource:0}: Error finding container a9ef8898a66643bdb2346c34c0a509c37c22b425634242699069cf1cc02a8be5: Status 404 returned error can't find the container with id a9ef8898a66643bdb2346c34c0a509c37c22b425634242699069cf1cc02a8be5 Mar 19 15:34:11 crc kubenswrapper[4771]: I0319 15:34:11.198179 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" event={"ID":"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d","Type":"ContainerStarted","Data":"a8cb252407d2bde708e4bba4a0bd274200f83406dc3d29dedd99a3ef6afb73a4"} Mar 19 15:34:11 crc kubenswrapper[4771]: I0319 15:34:11.198634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" event={"ID":"3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d","Type":"ContainerStarted","Data":"a9ef8898a66643bdb2346c34c0a509c37c22b425634242699069cf1cc02a8be5"} Mar 19 15:34:11 crc kubenswrapper[4771]: I0319 15:34:11.198657 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:34:11 crc kubenswrapper[4771]: I0319 15:34:11.244762 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" 
podStartSLOduration=33.244741044 podStartE2EDuration="33.244741044s" podCreationTimestamp="2026-03-19 15:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:34:11.230702507 +0000 UTC m=+1110.459323709" watchObservedRunningTime="2026-03-19 15:34:11.244741044 +0000 UTC m=+1110.473362246" Mar 19 15:34:11 crc kubenswrapper[4771]: I0319 15:34:11.524724 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68217883-a964-4b84-880a-6ac714e5e58e" path="/var/lib/kubelet/pods/68217883-a964-4b84-880a-6ac714e5e58e/volumes" Mar 19 15:34:12 crc kubenswrapper[4771]: I0319 15:34:12.849786 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:34:12 crc kubenswrapper[4771]: I0319 15:34:12.850222 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:34:12 crc kubenswrapper[4771]: I0319 15:34:12.924498 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.227617 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" event={"ID":"6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992","Type":"ContainerStarted","Data":"8ab0acf140de552450c422c6083a78f2e89d1bd8bf5a218126093bf02b7c9da7"} Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.227736 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.229322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" 
event={"ID":"14fa11c5-1371-4da1-aa9b-8b7b2463600e","Type":"ContainerStarted","Data":"0029bd0e67a762d41cb248ae8749247e70554dfafcf5484cf73d791fac1fa36d"} Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.229495 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.230526 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" event={"ID":"051674d9-53cb-4cbc-ae54-b6beb16456ee","Type":"ContainerStarted","Data":"e4423b6c106876bc0323afd26fc9be0d0a0884b23660e8636e6664498fb15c0a"} Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.230660 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.231878 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" event={"ID":"2a2f4027-c0c8-4032-9e40-ab2ce99c899f","Type":"ContainerStarted","Data":"79d66e060d8ff368731363e3cf75ea6886a0a56cdf3815da61468c41a411e82e"} Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.232138 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.265516 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" podStartSLOduration=3.14099012 podStartE2EDuration="36.265497716s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.17867886 +0000 UTC m=+1078.407300072" lastFinishedPulling="2026-03-19 15:34:12.303186436 +0000 UTC m=+1111.531807668" observedRunningTime="2026-03-19 
15:34:13.262124396 +0000 UTC m=+1112.490745598" watchObservedRunningTime="2026-03-19 15:34:13.265497716 +0000 UTC m=+1112.494118918" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.266655 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" podStartSLOduration=34.019169649 podStartE2EDuration="36.266650264s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:34:09.842759141 +0000 UTC m=+1109.071380343" lastFinishedPulling="2026-03-19 15:34:12.090239756 +0000 UTC m=+1111.318860958" observedRunningTime="2026-03-19 15:34:13.246183184 +0000 UTC m=+1112.474804376" watchObservedRunningTime="2026-03-19 15:34:13.266650264 +0000 UTC m=+1112.495271466" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.279465 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.280703 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" podStartSLOduration=3.151821 podStartE2EDuration="36.280692571s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:33:39.179084859 +0000 UTC m=+1078.407706061" lastFinishedPulling="2026-03-19 15:34:12.30795644 +0000 UTC m=+1111.536577632" observedRunningTime="2026-03-19 15:34:13.275600518 +0000 UTC m=+1112.504221730" watchObservedRunningTime="2026-03-19 15:34:13.280692571 +0000 UTC m=+1112.509313773" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.292690 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" podStartSLOduration=3.126029377 podStartE2EDuration="36.292630226s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" 
firstStartedPulling="2026-03-19 15:33:39.230541188 +0000 UTC m=+1078.459162390" lastFinishedPulling="2026-03-19 15:34:12.397142047 +0000 UTC m=+1111.625763239" observedRunningTime="2026-03-19 15:34:13.290396943 +0000 UTC m=+1112.519018145" watchObservedRunningTime="2026-03-19 15:34:13.292630226 +0000 UTC m=+1112.521251448" Mar 19 15:34:13 crc kubenswrapper[4771]: I0319 15:34:13.714872 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r5frl"] Mar 19 15:34:14 crc kubenswrapper[4771]: I0319 15:34:14.241141 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" event={"ID":"7ccc7b81-b1de-48e8-aec9-f88f615ebf88","Type":"ContainerStarted","Data":"acc6e5f86563cf9d06c64e51e65c6d32c17417d728808cc4f44fb3dd3d5366b2"} Mar 19 15:34:14 crc kubenswrapper[4771]: I0319 15:34:14.285777 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" podStartSLOduration=33.404839645 podStartE2EDuration="37.285750215s" podCreationTimestamp="2026-03-19 15:33:37 +0000 UTC" firstStartedPulling="2026-03-19 15:34:10.166825594 +0000 UTC m=+1109.395446796" lastFinishedPulling="2026-03-19 15:34:14.047736164 +0000 UTC m=+1113.276357366" observedRunningTime="2026-03-19 15:34:14.277188529 +0000 UTC m=+1113.505809771" watchObservedRunningTime="2026-03-19 15:34:14.285750215 +0000 UTC m=+1113.514371427" Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.248921 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r5frl" podUID="003252d4-823f-4462-94cf-77521e666cec" containerName="registry-server" containerID="cri-o://69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd" gracePeriod=2 Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.249163 4771 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.685757 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.819895 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-utilities\") pod \"003252d4-823f-4462-94cf-77521e666cec\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.819964 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-catalog-content\") pod \"003252d4-823f-4462-94cf-77521e666cec\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.820069 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgxrh\" (UniqueName: \"kubernetes.io/projected/003252d4-823f-4462-94cf-77521e666cec-kube-api-access-cgxrh\") pod \"003252d4-823f-4462-94cf-77521e666cec\" (UID: \"003252d4-823f-4462-94cf-77521e666cec\") " Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.822424 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-utilities" (OuterVolumeSpecName: "utilities") pod "003252d4-823f-4462-94cf-77521e666cec" (UID: "003252d4-823f-4462-94cf-77521e666cec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.825481 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003252d4-823f-4462-94cf-77521e666cec-kube-api-access-cgxrh" (OuterVolumeSpecName: "kube-api-access-cgxrh") pod "003252d4-823f-4462-94cf-77521e666cec" (UID: "003252d4-823f-4462-94cf-77521e666cec"). InnerVolumeSpecName "kube-api-access-cgxrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.874910 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "003252d4-823f-4462-94cf-77521e666cec" (UID: "003252d4-823f-4462-94cf-77521e666cec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.921610 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.921652 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003252d4-823f-4462-94cf-77521e666cec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:34:15 crc kubenswrapper[4771]: I0319 15:34:15.921668 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgxrh\" (UniqueName: \"kubernetes.io/projected/003252d4-823f-4462-94cf-77521e666cec-kube-api-access-cgxrh\") on node \"crc\" DevicePath \"\"" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.263258 4771 generic.go:334] "Generic (PLEG): container finished" podID="003252d4-823f-4462-94cf-77521e666cec" 
containerID="69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd" exitCode=0 Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.263349 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r5frl" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.263350 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5frl" event={"ID":"003252d4-823f-4462-94cf-77521e666cec","Type":"ContainerDied","Data":"69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd"} Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.263438 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5frl" event={"ID":"003252d4-823f-4462-94cf-77521e666cec","Type":"ContainerDied","Data":"4ce35f0ee8dcaa35a4fedd670ae0f4c653d880b9323c49bb7bcbe190156586e6"} Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.263486 4771 scope.go:117] "RemoveContainer" containerID="69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.319332 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r5frl"] Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.330206 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r5frl"] Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.827077 4771 scope.go:117] "RemoveContainer" containerID="4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.849705 4771 scope.go:117] "RemoveContainer" containerID="46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.869280 4771 scope.go:117] "RemoveContainer" containerID="69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd" Mar 19 
15:34:16 crc kubenswrapper[4771]: E0319 15:34:16.869780 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd\": container with ID starting with 69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd not found: ID does not exist" containerID="69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.869815 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd"} err="failed to get container status \"69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd\": rpc error: code = NotFound desc = could not find container \"69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd\": container with ID starting with 69249d76a298a6f10961c00dea8a5a02b4ccb0a9c35288d825230c1154e070bd not found: ID does not exist" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.869839 4771 scope.go:117] "RemoveContainer" containerID="4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c" Mar 19 15:34:16 crc kubenswrapper[4771]: E0319 15:34:16.870288 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c\": container with ID starting with 4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c not found: ID does not exist" containerID="4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.870327 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c"} err="failed to get container status 
\"4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c\": rpc error: code = NotFound desc = could not find container \"4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c\": container with ID starting with 4ab171d2eef64c620ca27a8fe726e6598055217f62665ef2387f45adb4b6a28c not found: ID does not exist" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.870352 4771 scope.go:117] "RemoveContainer" containerID="46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553" Mar 19 15:34:16 crc kubenswrapper[4771]: E0319 15:34:16.870710 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553\": container with ID starting with 46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553 not found: ID does not exist" containerID="46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553" Mar 19 15:34:16 crc kubenswrapper[4771]: I0319 15:34:16.870736 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553"} err="failed to get container status \"46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553\": rpc error: code = NotFound desc = could not find container \"46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553\": container with ID starting with 46d835e3af7b8c415a2bf4b35772e51db7be38eb163a26035a1bbf4eac04f553 not found: ID does not exist" Mar 19 15:34:17 crc kubenswrapper[4771]: I0319 15:34:17.524015 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003252d4-823f-4462-94cf-77521e666cec" path="/var/lib/kubelet/pods/003252d4-823f-4462-94cf-77521e666cec/volumes" Mar 19 15:34:17 crc kubenswrapper[4771]: I0319 15:34:17.697449 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-9mp28" Mar 19 15:34:17 crc kubenswrapper[4771]: I0319 15:34:17.756736 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7p6l9" Mar 19 15:34:17 crc kubenswrapper[4771]: I0319 15:34:17.918105 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-knffk" Mar 19 15:34:17 crc kubenswrapper[4771]: I0319 15:34:17.947724 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-t7mdc" Mar 19 15:34:17 crc kubenswrapper[4771]: I0319 15:34:17.968714 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-w4pll" Mar 19 15:34:18 crc kubenswrapper[4771]: I0319 15:34:18.107125 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5zj7h" Mar 19 15:34:18 crc kubenswrapper[4771]: I0319 15:34:18.129682 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xhqbn" Mar 19 15:34:18 crc kubenswrapper[4771]: I0319 15:34:18.290203 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r479m" Mar 19 15:34:19 crc kubenswrapper[4771]: I0319 15:34:19.365625 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b487c85ff-rdr25" Mar 19 15:34:19 crc kubenswrapper[4771]: I0319 15:34:19.960087 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4fwmj" Mar 19 15:34:20 crc kubenswrapper[4771]: I0319 15:34:20.309762 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-75b679d9b8-4nqx6" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.041391 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hrhlg"] Mar 19 15:34:36 crc kubenswrapper[4771]: E0319 15:34:36.042273 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003252d4-823f-4462-94cf-77521e666cec" containerName="extract-utilities" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.042288 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="003252d4-823f-4462-94cf-77521e666cec" containerName="extract-utilities" Mar 19 15:34:36 crc kubenswrapper[4771]: E0319 15:34:36.042303 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7bd159-532b-4d06-9079-4a04e5a1c3a4" containerName="oc" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.042311 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7bd159-532b-4d06-9079-4a04e5a1c3a4" containerName="oc" Mar 19 15:34:36 crc kubenswrapper[4771]: E0319 15:34:36.042328 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003252d4-823f-4462-94cf-77521e666cec" containerName="extract-content" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.042337 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="003252d4-823f-4462-94cf-77521e666cec" containerName="extract-content" Mar 19 15:34:36 crc kubenswrapper[4771]: E0319 15:34:36.042353 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003252d4-823f-4462-94cf-77521e666cec" containerName="registry-server" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.042361 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="003252d4-823f-4462-94cf-77521e666cec" containerName="registry-server" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.042536 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="003252d4-823f-4462-94cf-77521e666cec" containerName="registry-server" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.042548 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7bd159-532b-4d06-9079-4a04e5a1c3a4" containerName="oc" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.044328 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.047348 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.047656 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-nwlf2" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.048006 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.048081 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.066746 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hrhlg"] Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.112707 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vj2zl"] Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.114744 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.117294 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.127725 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vj2zl"] Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.133437 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgp25\" (UniqueName: \"kubernetes.io/projected/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-kube-api-access-bgp25\") pod \"dnsmasq-dns-675f4bcbfc-hrhlg\" (UID: \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.133569 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-config\") pod \"dnsmasq-dns-675f4bcbfc-hrhlg\" (UID: \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.235261 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-config\") pod \"dnsmasq-dns-78dd6ddcc-vj2zl\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.235337 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-config\") pod \"dnsmasq-dns-675f4bcbfc-hrhlg\" (UID: \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:36 crc 
kubenswrapper[4771]: I0319 15:34:36.235371 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5lqk\" (UniqueName: \"kubernetes.io/projected/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-kube-api-access-k5lqk\") pod \"dnsmasq-dns-78dd6ddcc-vj2zl\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.235394 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vj2zl\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.235421 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgp25\" (UniqueName: \"kubernetes.io/projected/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-kube-api-access-bgp25\") pod \"dnsmasq-dns-675f4bcbfc-hrhlg\" (UID: \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.236271 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-config\") pod \"dnsmasq-dns-675f4bcbfc-hrhlg\" (UID: \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.262898 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgp25\" (UniqueName: \"kubernetes.io/projected/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-kube-api-access-bgp25\") pod \"dnsmasq-dns-675f4bcbfc-hrhlg\" (UID: \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:36 crc kubenswrapper[4771]: 
I0319 15:34:36.336588 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-config\") pod \"dnsmasq-dns-78dd6ddcc-vj2zl\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.337588 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-config\") pod \"dnsmasq-dns-78dd6ddcc-vj2zl\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.337767 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5lqk\" (UniqueName: \"kubernetes.io/projected/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-kube-api-access-k5lqk\") pod \"dnsmasq-dns-78dd6ddcc-vj2zl\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.338142 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vj2zl\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.338749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vj2zl\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.354292 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5lqk\" 
(UniqueName: \"kubernetes.io/projected/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-kube-api-access-k5lqk\") pod \"dnsmasq-dns-78dd6ddcc-vj2zl\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.365932 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.433951 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.690055 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vj2zl"] Mar 19 15:34:36 crc kubenswrapper[4771]: I0319 15:34:36.849420 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hrhlg"] Mar 19 15:34:37 crc kubenswrapper[4771]: I0319 15:34:37.415681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" event={"ID":"62ca8fe5-562c-49e7-b6ea-018eb0bf9739","Type":"ContainerStarted","Data":"ee691fb9efcba437180f6d47f3dd10dc9469d8e85340631ec305c33a80a1f5a4"} Mar 19 15:34:37 crc kubenswrapper[4771]: I0319 15:34:37.417718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" event={"ID":"07e54df0-321a-4a59-800e-1bdfdb4b6eaf","Type":"ContainerStarted","Data":"1b21cfcad079ee83f28efa1ab8c308841fd6e5abb050c66aaeb0ff8233b3c188"} Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.748318 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hrhlg"] Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.784785 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jc5w2"] Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.787186 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.807158 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jc5w2"] Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.887449 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-config\") pod \"dnsmasq-dns-666b6646f7-jc5w2\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.887526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw25d\" (UniqueName: \"kubernetes.io/projected/511c87f1-6ccf-4c31-bc90-73af11c879e7-kube-api-access-nw25d\") pod \"dnsmasq-dns-666b6646f7-jc5w2\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.887605 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jc5w2\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.988976 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jc5w2\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.989069 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-config\") pod \"dnsmasq-dns-666b6646f7-jc5w2\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.989113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw25d\" (UniqueName: \"kubernetes.io/projected/511c87f1-6ccf-4c31-bc90-73af11c879e7-kube-api-access-nw25d\") pod \"dnsmasq-dns-666b6646f7-jc5w2\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.990479 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jc5w2\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:38 crc kubenswrapper[4771]: I0319 15:34:38.991227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-config\") pod \"dnsmasq-dns-666b6646f7-jc5w2\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.006723 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vj2zl"] Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.022889 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw25d\" (UniqueName: \"kubernetes.io/projected/511c87f1-6ccf-4c31-bc90-73af11c879e7-kube-api-access-nw25d\") pod \"dnsmasq-dns-666b6646f7-jc5w2\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.035206 4771 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zfhtw"] Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.036726 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.051896 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zfhtw"] Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.113481 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.191688 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-config\") pod \"dnsmasq-dns-57d769cc4f-zfhtw\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.191769 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7xg5\" (UniqueName: \"kubernetes.io/projected/14a12c4b-9c29-45de-81ad-cc71b5235052-kube-api-access-d7xg5\") pod \"dnsmasq-dns-57d769cc4f-zfhtw\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.191894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zfhtw\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.293027 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-config\") pod \"dnsmasq-dns-57d769cc4f-zfhtw\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.293079 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7xg5\" (UniqueName: \"kubernetes.io/projected/14a12c4b-9c29-45de-81ad-cc71b5235052-kube-api-access-d7xg5\") pod \"dnsmasq-dns-57d769cc4f-zfhtw\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.293117 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zfhtw\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.294080 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zfhtw\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.294198 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-config\") pod \"dnsmasq-dns-57d769cc4f-zfhtw\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.319508 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7xg5\" (UniqueName: \"kubernetes.io/projected/14a12c4b-9c29-45de-81ad-cc71b5235052-kube-api-access-d7xg5\") pod 
\"dnsmasq-dns-57d769cc4f-zfhtw\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.378296 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.615025 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jc5w2"] Mar 19 15:34:39 crc kubenswrapper[4771]: W0319 15:34:39.621754 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod511c87f1_6ccf_4c31_bc90_73af11c879e7.slice/crio-39ed69d7e863a0c6c2b44b606c1c5fffad966fadb0cb1bf09a77040fa217d39e WatchSource:0}: Error finding container 39ed69d7e863a0c6c2b44b606c1c5fffad966fadb0cb1bf09a77040fa217d39e: Status 404 returned error can't find the container with id 39ed69d7e863a0c6c2b44b606c1c5fffad966fadb0cb1bf09a77040fa217d39e Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.751547 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.753084 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.755947 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.756049 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.756177 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.755951 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.756420 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rlbq6" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.756484 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.756896 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.760133 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zfhtw"] Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.772256 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 15:34:39 crc kubenswrapper[4771]: W0319 15:34:39.772789 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14a12c4b_9c29_45de_81ad_cc71b5235052.slice/crio-5a1da17fc129dedd6bf28cb80892e974e2c4a7e611c31f108b8f1df27d053758 WatchSource:0}: Error finding container 5a1da17fc129dedd6bf28cb80892e974e2c4a7e611c31f108b8f1df27d053758: Status 404 returned error 
can't find the container with id 5a1da17fc129dedd6bf28cb80892e974e2c4a7e611c31f108b8f1df27d053758 Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.903588 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.903897 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.903950 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74c5f622-0ced-47f9-80d5-75a09acfafc0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.904015 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74c5f622-0ced-47f9-80d5-75a09acfafc0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.904059 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " 
pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.904084 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.904107 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.904158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74c5f622-0ced-47f9-80d5-75a09acfafc0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.904194 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74c5f622-0ced-47f9-80d5-75a09acfafc0-config-data\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.904217 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj8dp\" (UniqueName: \"kubernetes.io/projected/74c5f622-0ced-47f9-80d5-75a09acfafc0-kube-api-access-hj8dp\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc 
kubenswrapper[4771]: I0319 15:34:39.904245 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74c5f622-0ced-47f9-80d5-75a09acfafc0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.965805 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.967278 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.974486 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.974525 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.974816 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.978276 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.978339 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ln6cm" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.978479 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.979257 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 15:34:39 crc kubenswrapper[4771]: I0319 15:34:39.980733 4771 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006220 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74c5f622-0ced-47f9-80d5-75a09acfafc0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006310 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006350 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006392 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74c5f622-0ced-47f9-80d5-75a09acfafc0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006455 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74c5f622-0ced-47f9-80d5-75a09acfafc0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006510 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006584 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006626 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74c5f622-0ced-47f9-80d5-75a09acfafc0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006694 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74c5f622-0ced-47f9-80d5-75a09acfafc0-config-data\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.006743 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8dp\" (UniqueName: 
\"kubernetes.io/projected/74c5f622-0ced-47f9-80d5-75a09acfafc0-kube-api-access-hj8dp\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.008716 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.009252 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74c5f622-0ced-47f9-80d5-75a09acfafc0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.009418 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.010182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.011028 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74c5f622-0ced-47f9-80d5-75a09acfafc0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " 
pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.012089 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74c5f622-0ced-47f9-80d5-75a09acfafc0-config-data\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.017904 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.017972 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74c5f622-0ced-47f9-80d5-75a09acfafc0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.038449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.047196 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8dp\" (UniqueName: \"kubernetes.io/projected/74c5f622-0ced-47f9-80d5-75a09acfafc0-kube-api-access-hj8dp\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.048213 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/74c5f622-0ced-47f9-80d5-75a09acfafc0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.062919 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74c5f622-0ced-47f9-80d5-75a09acfafc0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74c5f622-0ced-47f9-80d5-75a09acfafc0\") " pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.084908 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.108971 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c065c328-37e2-4905-9d1e-82208eab196e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109037 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109057 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c065c328-37e2-4905-9d1e-82208eab196e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109202 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwp5\" (UniqueName: \"kubernetes.io/projected/c065c328-37e2-4905-9d1e-82208eab196e-kube-api-access-pvwp5\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109326 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c065c328-37e2-4905-9d1e-82208eab196e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109357 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109392 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109434 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109477 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c065c328-37e2-4905-9d1e-82208eab196e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109587 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c065c328-37e2-4905-9d1e-82208eab196e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.109607 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211575 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwp5\" (UniqueName: \"kubernetes.io/projected/c065c328-37e2-4905-9d1e-82208eab196e-kube-api-access-pvwp5\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211674 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c065c328-37e2-4905-9d1e-82208eab196e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211705 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211744 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211784 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211814 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c065c328-37e2-4905-9d1e-82208eab196e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211881 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211905 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c065c328-37e2-4905-9d1e-82208eab196e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211939 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c065c328-37e2-4905-9d1e-82208eab196e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.211966 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.212007 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c065c328-37e2-4905-9d1e-82208eab196e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.213661 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c065c328-37e2-4905-9d1e-82208eab196e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.214803 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c065c328-37e2-4905-9d1e-82208eab196e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.215238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c065c328-37e2-4905-9d1e-82208eab196e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.215835 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.215891 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.216111 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.218683 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c065c328-37e2-4905-9d1e-82208eab196e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.218780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c065c328-37e2-4905-9d1e-82208eab196e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.219837 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c065c328-37e2-4905-9d1e-82208eab196e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.226956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c065c328-37e2-4905-9d1e-82208eab196e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.248111 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwp5\" (UniqueName: \"kubernetes.io/projected/c065c328-37e2-4905-9d1e-82208eab196e-kube-api-access-pvwp5\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.282330 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c065c328-37e2-4905-9d1e-82208eab196e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.292436 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.375561 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.461241 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" event={"ID":"14a12c4b-9c29-45de-81ad-cc71b5235052","Type":"ContainerStarted","Data":"5a1da17fc129dedd6bf28cb80892e974e2c4a7e611c31f108b8f1df27d053758"} Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.466438 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" event={"ID":"511c87f1-6ccf-4c31-bc90-73af11c879e7","Type":"ContainerStarted","Data":"39ed69d7e863a0c6c2b44b606c1c5fffad966fadb0cb1bf09a77040fa217d39e"} Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.469581 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"b10f5f5fc21fef7963653b51fcd65654f902b08aa1e8e9de2f883b3740a9cf73"} Mar 19 15:34:40 crc kubenswrapper[4771]: I0319 15:34:40.854251 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.370483 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.371949 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.386527 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.386770 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-52p9r" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.387174 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.388474 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.388591 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.397351 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.447505 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnztr\" (UniqueName: \"kubernetes.io/projected/66d10600-3f91-4e77-a751-7f6fbe7148ea-kube-api-access-pnztr\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.447578 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d10600-3f91-4e77-a751-7f6fbe7148ea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.447655 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d10600-3f91-4e77-a751-7f6fbe7148ea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.447675 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66d10600-3f91-4e77-a751-7f6fbe7148ea-config-data-default\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.447692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66d10600-3f91-4e77-a751-7f6fbe7148ea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.447707 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d10600-3f91-4e77-a751-7f6fbe7148ea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.447733 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66d10600-3f91-4e77-a751-7f6fbe7148ea-kolla-config\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.447795 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.550211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnztr\" (UniqueName: \"kubernetes.io/projected/66d10600-3f91-4e77-a751-7f6fbe7148ea-kube-api-access-pnztr\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.550278 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d10600-3f91-4e77-a751-7f6fbe7148ea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.550308 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d10600-3f91-4e77-a751-7f6fbe7148ea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.550343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66d10600-3f91-4e77-a751-7f6fbe7148ea-config-data-default\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.550363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66d10600-3f91-4e77-a751-7f6fbe7148ea-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.550379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d10600-3f91-4e77-a751-7f6fbe7148ea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.550418 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66d10600-3f91-4e77-a751-7f6fbe7148ea-kolla-config\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.550469 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.550886 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.564176 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.564373 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.568651 4771 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.572850 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.632242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66d10600-3f91-4e77-a751-7f6fbe7148ea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.640401 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d10600-3f91-4e77-a751-7f6fbe7148ea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.648436 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d10600-3f91-4e77-a751-7f6fbe7148ea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.648863 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66d10600-3f91-4e77-a751-7f6fbe7148ea-kolla-config\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.669349 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnztr\" (UniqueName: \"kubernetes.io/projected/66d10600-3f91-4e77-a751-7f6fbe7148ea-kube-api-access-pnztr\") pod \"openstack-galera-0\" (UID: 
\"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.670298 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.674794 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d10600-3f91-4e77-a751-7f6fbe7148ea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.681785 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66d10600-3f91-4e77-a751-7f6fbe7148ea-config-data-default\") pod \"openstack-galera-0\" (UID: \"66d10600-3f91-4e77-a751-7f6fbe7148ea\") " pod="openstack/openstack-galera-0" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.715451 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-52p9r" Mar 19 15:34:41 crc kubenswrapper[4771]: I0319 15:34:41.725181 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.432320 4771 scope.go:117] "RemoveContainer" containerID="3e4269d0a352b4a88f7c4b744aa306758accf3c08d237ea617934a623dc02536" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.668607 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.670528 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.673450 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.673671 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.673949 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xgv5x" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.674031 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.679548 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.781898 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.782941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa4604a-2110-4000-a893-d7f308d29bce-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.782978 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/daa4604a-2110-4000-a893-d7f308d29bce-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.783026 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/daa4604a-2110-4000-a893-d7f308d29bce-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.783045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bssdr\" (UniqueName: \"kubernetes.io/projected/daa4604a-2110-4000-a893-d7f308d29bce-kube-api-access-bssdr\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.783149 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/daa4604a-2110-4000-a893-d7f308d29bce-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.783205 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daa4604a-2110-4000-a893-d7f308d29bce-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0" Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.783238 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/daa4604a-2110-4000-a893-d7f308d29bce-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.887607 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daa4604a-2110-4000-a893-d7f308d29bce-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.895915 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daa4604a-2110-4000-a893-d7f308d29bce-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.896009 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/daa4604a-2110-4000-a893-d7f308d29bce-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.899957 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/daa4604a-2110-4000-a893-d7f308d29bce-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.901477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.901627 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa4604a-2110-4000-a893-d7f308d29bce-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.901669 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/daa4604a-2110-4000-a893-d7f308d29bce-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.901719 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/daa4604a-2110-4000-a893-d7f308d29bce-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.901742 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssdr\" (UniqueName: \"kubernetes.io/projected/daa4604a-2110-4000-a893-d7f308d29bce-kube-api-access-bssdr\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.902882 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/daa4604a-2110-4000-a893-d7f308d29bce-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.903515 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/daa4604a-2110-4000-a893-d7f308d29bce-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.904563 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.907913 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/daa4604a-2110-4000-a893-d7f308d29bce-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.908586 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/daa4604a-2110-4000-a893-d7f308d29bce-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.926859 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssdr\" (UniqueName: \"kubernetes.io/projected/daa4604a-2110-4000-a893-d7f308d29bce-kube-api-access-bssdr\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.928921 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa4604a-2110-4000-a893-d7f308d29bce-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.936275 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"daa4604a-2110-4000-a893-d7f308d29bce\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:42 crc kubenswrapper[4771]: I0319 15:34:42.987481 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.098521 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.099568 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.102253 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xdrts"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.102516 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.102678 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.103522 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.234190 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxpkr\" (UniqueName: \"kubernetes.io/projected/1e06921b-f2eb-4ef0-8256-405214a269e0-kube-api-access-bxpkr\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.234257 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e06921b-f2eb-4ef0-8256-405214a269e0-kolla-config\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.234346 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e06921b-f2eb-4ef0-8256-405214a269e0-config-data\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.234369 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e06921b-f2eb-4ef0-8256-405214a269e0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.234481 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e06921b-f2eb-4ef0-8256-405214a269e0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.335912 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxpkr\" (UniqueName: \"kubernetes.io/projected/1e06921b-f2eb-4ef0-8256-405214a269e0-kube-api-access-bxpkr\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.335963 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e06921b-f2eb-4ef0-8256-405214a269e0-kolla-config\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.336051 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e06921b-f2eb-4ef0-8256-405214a269e0-config-data\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.336082 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e06921b-f2eb-4ef0-8256-405214a269e0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.336130 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e06921b-f2eb-4ef0-8256-405214a269e0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.336776 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e06921b-f2eb-4ef0-8256-405214a269e0-kolla-config\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.337099 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e06921b-f2eb-4ef0-8256-405214a269e0-config-data\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.351598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e06921b-f2eb-4ef0-8256-405214a269e0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.355766 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e06921b-f2eb-4ef0-8256-405214a269e0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.356267 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxpkr\" (UniqueName: \"kubernetes.io/projected/1e06921b-f2eb-4ef0-8256-405214a269e0-kube-api-access-bxpkr\") pod \"memcached-0\" (UID: \"1e06921b-f2eb-4ef0-8256-405214a269e0\") " pod="openstack/memcached-0"
Mar 19 15:34:43 crc kubenswrapper[4771]: I0319 15:34:43.468444 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 19 15:34:45 crc kubenswrapper[4771]: I0319 15:34:45.130618 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 19 15:34:45 crc kubenswrapper[4771]: I0319 15:34:45.131822 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 19 15:34:45 crc kubenswrapper[4771]: I0319 15:34:45.133911 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wcwlt"
Mar 19 15:34:45 crc kubenswrapper[4771]: I0319 15:34:45.237457 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 19 15:34:45 crc kubenswrapper[4771]: I0319 15:34:45.274436 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpbdt\" (UniqueName: \"kubernetes.io/projected/49580579-6baf-4e6b-85e1-0dba0fb59d97-kube-api-access-mpbdt\") pod \"kube-state-metrics-0\" (UID: \"49580579-6baf-4e6b-85e1-0dba0fb59d97\") " pod="openstack/kube-state-metrics-0"
Mar 19 15:34:45 crc kubenswrapper[4771]: I0319 15:34:45.375885 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpbdt\" (UniqueName: \"kubernetes.io/projected/49580579-6baf-4e6b-85e1-0dba0fb59d97-kube-api-access-mpbdt\") pod \"kube-state-metrics-0\" (UID: \"49580579-6baf-4e6b-85e1-0dba0fb59d97\") " pod="openstack/kube-state-metrics-0"
Mar 19 15:34:45 crc kubenswrapper[4771]: I0319 15:34:45.396632 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpbdt\" (UniqueName: \"kubernetes.io/projected/49580579-6baf-4e6b-85e1-0dba0fb59d97-kube-api-access-mpbdt\") pod \"kube-state-metrics-0\" (UID: \"49580579-6baf-4e6b-85e1-0dba0fb59d97\") " pod="openstack/kube-state-metrics-0"
Mar 19 15:34:45 crc kubenswrapper[4771]: I0319 15:34:45.450620 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 19 15:34:45 crc kubenswrapper[4771]: I0319 15:34:45.507039 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"579d72c175d20c126dc2e9338dc58bd220bd769b26c5efc63c2940b6674699fe"}
Mar 19 15:34:46 crc kubenswrapper[4771]: I0319 15:34:46.480277 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.379978 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-w5jsx"]
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.381184 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.383315 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jmrkc"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.383563 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.383680 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.388341 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w5jsx"]
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.398029 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-rs8tv"]
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.402750 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.443509 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rs8tv"]
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548686 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpd7m\" (UniqueName: \"kubernetes.io/projected/9af24d0b-354c-4109-a11a-2e56c65b8b0a-kube-api-access-xpd7m\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548732 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-var-lib\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548770 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9af24d0b-354c-4109-a11a-2e56c65b8b0a-scripts\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548802 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-etc-ovs\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548820 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-var-run\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548851 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c6ed19-f258-4da0-966a-6c538b85dce1-ovn-controller-tls-certs\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548873 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8c6ed19-f258-4da0-966a-6c538b85dce1-scripts\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548896 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92tj2\" (UniqueName: \"kubernetes.io/projected/e8c6ed19-f258-4da0-966a-6c538b85dce1-kube-api-access-92tj2\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548912 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e8c6ed19-f258-4da0-966a-6c538b85dce1-var-run-ovn\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548932 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-var-log\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548953 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e8c6ed19-f258-4da0-966a-6c538b85dce1-var-run\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548967 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c6ed19-f258-4da0-966a-6c538b85dce1-combined-ca-bundle\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.548996 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e8c6ed19-f258-4da0-966a-6c538b85dce1-var-log-ovn\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.649824 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-etc-ovs\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.649903 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-var-run\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.649964 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c6ed19-f258-4da0-966a-6c538b85dce1-ovn-controller-tls-certs\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650039 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8c6ed19-f258-4da0-966a-6c538b85dce1-scripts\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92tj2\" (UniqueName: \"kubernetes.io/projected/e8c6ed19-f258-4da0-966a-6c538b85dce1-kube-api-access-92tj2\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650117 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e8c6ed19-f258-4da0-966a-6c538b85dce1-var-run-ovn\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650150 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-var-log\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650183 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e8c6ed19-f258-4da0-966a-6c538b85dce1-var-run\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650202 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c6ed19-f258-4da0-966a-6c538b85dce1-combined-ca-bundle\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e8c6ed19-f258-4da0-966a-6c538b85dce1-var-log-ovn\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650265 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-var-lib\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650284 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpd7m\" (UniqueName: \"kubernetes.io/projected/9af24d0b-354c-4109-a11a-2e56c65b8b0a-kube-api-access-xpd7m\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650361 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9af24d0b-354c-4109-a11a-2e56c65b8b0a-scripts\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650592 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e8c6ed19-f258-4da0-966a-6c538b85dce1-var-run-ovn\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650629 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-var-run\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-var-log\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650737 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e8c6ed19-f258-4da0-966a-6c538b85dce1-var-log-ovn\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650754 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e8c6ed19-f258-4da0-966a-6c538b85dce1-var-run\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.650781 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-var-lib\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.654051 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9af24d0b-354c-4109-a11a-2e56c65b8b0a-scripts\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.658401 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c6ed19-f258-4da0-966a-6c538b85dce1-ovn-controller-tls-certs\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.658546 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c6ed19-f258-4da0-966a-6c538b85dce1-combined-ca-bundle\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.660376 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9af24d0b-354c-4109-a11a-2e56c65b8b0a-etc-ovs\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.664529 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8c6ed19-f258-4da0-966a-6c538b85dce1-scripts\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.667563 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92tj2\" (UniqueName: \"kubernetes.io/projected/e8c6ed19-f258-4da0-966a-6c538b85dce1-kube-api-access-92tj2\") pod \"ovn-controller-w5jsx\" (UID: \"e8c6ed19-f258-4da0-966a-6c538b85dce1\") " pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.668707 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpd7m\" (UniqueName: \"kubernetes.io/projected/9af24d0b-354c-4109-a11a-2e56c65b8b0a-kube-api-access-xpd7m\") pod \"ovn-controller-ovs-rs8tv\" (UID: \"9af24d0b-354c-4109-a11a-2e56c65b8b0a\") " pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.740287 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w5jsx"
Mar 19 15:34:48 crc kubenswrapper[4771]: I0319 15:34:48.747800 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rs8tv"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.281504 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.283779 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.286713 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-plqmw"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.287091 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.287458 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.289943 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.290173 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.297734 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.465125 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.465198 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.465227 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.465248 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpjl2\" (UniqueName: \"kubernetes.io/projected/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-kube-api-access-lpjl2\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.465300 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.465328 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.465358 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.465390 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.566470 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.566554 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.566582 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.566607 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpjl2\" (UniqueName: \"kubernetes.io/projected/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-kube-api-access-lpjl2\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.566658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.566685 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.566717 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.566751 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.567136 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.567196 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.568431 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\"
(UniqueName: \"kubernetes.io/configmap/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0" Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.568839 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0" Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.571034 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0" Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.571412 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0" Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.571522 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0" Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.581926 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpjl2\" (UniqueName: \"kubernetes.io/projected/71aa1a31-80b5-40d9-9549-f12b2f0c34aa-kube-api-access-lpjl2\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " 
pod="openstack/ovsdbserver-nb-0" Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.590554 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"71aa1a31-80b5-40d9-9549-f12b2f0c34aa\") " pod="openstack/ovsdbserver-nb-0" Mar 19 15:34:49 crc kubenswrapper[4771]: I0319 15:34:49.606445 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.093043 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.568843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"66d10600-3f91-4e77-a751-7f6fbe7148ea","Type":"ContainerStarted","Data":"6b72695e945184d854aa8d3d12c4deea55a4ed4454f6908caaacbb2f69e4d7a8"} Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.583192 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.584458 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.588342 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-klghp" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.588740 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.588934 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.589716 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.607225 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.709293 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.709348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.709383 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f87j\" (UniqueName: \"kubernetes.io/projected/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-kube-api-access-9f87j\") pod 
\"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.709404 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.709427 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.709444 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.709462 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.709478 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 
crc kubenswrapper[4771]: I0319 15:34:51.813481 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f87j\" (UniqueName: \"kubernetes.io/projected/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-kube-api-access-9f87j\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.813536 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.813566 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.813587 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.813629 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.813649 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.813764 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.813804 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.814234 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.816050 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.816389 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " 
pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.817121 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.821398 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.839049 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.853743 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f87j\" (UniqueName: \"kubernetes.io/projected/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-kube-api-access-9f87j\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.854284 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.859728 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f\") " pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:51 crc kubenswrapper[4771]: I0319 15:34:51.921795 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 15:34:56 crc kubenswrapper[4771]: I0319 15:34:56.159297 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w5jsx"] Mar 19 15:34:56 crc kubenswrapper[4771]: I0319 15:34:56.424228 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 15:34:56 crc kubenswrapper[4771]: I0319 15:34:56.602969 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"daa4604a-2110-4000-a893-d7f308d29bce","Type":"ContainerStarted","Data":"9e654b17619cc69dc42ce48a6ce1aeb8019ba928ee1cae068e7693035c182883"} Mar 19 15:34:56 crc kubenswrapper[4771]: W0319 15:34:56.832917 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e06921b_f2eb_4ef0_8256_405214a269e0.slice/crio-286008db80c6a08e0bae46e795a83a2e15adbd9fb037ed12beb90d34cc256c5f WatchSource:0}: Error finding container 286008db80c6a08e0bae46e795a83a2e15adbd9fb037ed12beb90d34cc256c5f: Status 404 returned error can't find the container with id 286008db80c6a08e0bae46e795a83a2e15adbd9fb037ed12beb90d34cc256c5f Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.914437 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.914613 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k5lqk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevic
e{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-vj2zl_openstack(07e54df0-321a-4a59-800e-1bdfdb4b6eaf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.916273 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" podUID="07e54df0-321a-4a59-800e-1bdfdb4b6eaf" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.961425 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.961768 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgp25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-hrhlg_openstack(62ca8fe5-562c-49e7-b6ea-018eb0bf9739): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.963068 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" podUID="62ca8fe5-562c-49e7-b6ea-018eb0bf9739" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.967200 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.967335 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nw25d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullP
olicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-jc5w2_openstack(511c87f1-6ccf-4c31-bc90-73af11c879e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.968504 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" podUID="511c87f1-6ccf-4c31-bc90-73af11c879e7" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.968691 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.969081 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d7xg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-zfhtw_openstack(14a12c4b-9c29-45de-81ad-cc71b5235052): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 15:34:56 crc kubenswrapper[4771]: E0319 15:34:56.971822 4771 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" podUID="14a12c4b-9c29-45de-81ad-cc71b5235052" Mar 19 15:34:57 crc kubenswrapper[4771]: I0319 15:34:57.287796 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 15:34:57 crc kubenswrapper[4771]: W0319 15:34:57.297140 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49580579_6baf_4e6b_85e1_0dba0fb59d97.slice/crio-5bb75061f10e26c1bc3c289592ef0da1178d2e681a79273c4921e820a978ff30 WatchSource:0}: Error finding container 5bb75061f10e26c1bc3c289592ef0da1178d2e681a79273c4921e820a978ff30: Status 404 returned error can't find the container with id 5bb75061f10e26c1bc3c289592ef0da1178d2e681a79273c4921e820a978ff30 Mar 19 15:34:57 crc kubenswrapper[4771]: I0319 15:34:57.570782 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 15:34:57 crc kubenswrapper[4771]: I0319 15:34:57.613536 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1e06921b-f2eb-4ef0-8256-405214a269e0","Type":"ContainerStarted","Data":"286008db80c6a08e0bae46e795a83a2e15adbd9fb037ed12beb90d34cc256c5f"} Mar 19 15:34:57 crc kubenswrapper[4771]: I0319 15:34:57.615318 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"49580579-6baf-4e6b-85e1-0dba0fb59d97","Type":"ContainerStarted","Data":"5bb75061f10e26c1bc3c289592ef0da1178d2e681a79273c4921e820a978ff30"} Mar 19 15:34:57 crc kubenswrapper[4771]: I0319 15:34:57.616710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w5jsx" 
event={"ID":"e8c6ed19-f258-4da0-966a-6c538b85dce1","Type":"ContainerStarted","Data":"8ca08e3adff0c5dde72d62d730565b277ae97e8acd1a83cd16cb49ac0a7016e1"} Mar 19 15:34:57 crc kubenswrapper[4771]: E0319 15:34:57.618969 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" podUID="14a12c4b-9c29-45de-81ad-cc71b5235052" Mar 19 15:34:57 crc kubenswrapper[4771]: E0319 15:34:57.619016 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" podUID="511c87f1-6ccf-4c31-bc90-73af11c879e7" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.066168 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:58 crc kubenswrapper[4771]: W0319 15:34:58.140562 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e1c08cf_2d38_438c_b2b0_6e5b4f0b728f.slice/crio-68ab1fba0859988258ffe0748855ad328979283c8a88842e0ca934fe3ea00b1c WatchSource:0}: Error finding container 68ab1fba0859988258ffe0748855ad328979283c8a88842e0ca934fe3ea00b1c: Status 404 returned error can't find the container with id 68ab1fba0859988258ffe0748855ad328979283c8a88842e0ca934fe3ea00b1c Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.140763 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.151576 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.224499 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-config\") pod \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\" (UID: \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\") " Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.224556 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-dns-svc\") pod \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.224642 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5lqk\" (UniqueName: \"kubernetes.io/projected/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-kube-api-access-k5lqk\") pod \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.224669 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-config\") pod \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\" (UID: \"07e54df0-321a-4a59-800e-1bdfdb4b6eaf\") " Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.224793 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgp25\" (UniqueName: \"kubernetes.io/projected/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-kube-api-access-bgp25\") pod \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\" (UID: \"62ca8fe5-562c-49e7-b6ea-018eb0bf9739\") " Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.225504 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-config" (OuterVolumeSpecName: "config") pod "62ca8fe5-562c-49e7-b6ea-018eb0bf9739" (UID: "62ca8fe5-562c-49e7-b6ea-018eb0bf9739"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.225668 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07e54df0-321a-4a59-800e-1bdfdb4b6eaf" (UID: "07e54df0-321a-4a59-800e-1bdfdb4b6eaf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.225684 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-config" (OuterVolumeSpecName: "config") pod "07e54df0-321a-4a59-800e-1bdfdb4b6eaf" (UID: "07e54df0-321a-4a59-800e-1bdfdb4b6eaf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.231041 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-kube-api-access-bgp25" (OuterVolumeSpecName: "kube-api-access-bgp25") pod "62ca8fe5-562c-49e7-b6ea-018eb0bf9739" (UID: "62ca8fe5-562c-49e7-b6ea-018eb0bf9739"). InnerVolumeSpecName "kube-api-access-bgp25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.231567 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-kube-api-access-k5lqk" (OuterVolumeSpecName: "kube-api-access-k5lqk") pod "07e54df0-321a-4a59-800e-1bdfdb4b6eaf" (UID: "07e54df0-321a-4a59-800e-1bdfdb4b6eaf"). InnerVolumeSpecName "kube-api-access-k5lqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.326657 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgp25\" (UniqueName: \"kubernetes.io/projected/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-kube-api-access-bgp25\") on node \"crc\" DevicePath \"\"" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.326690 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62ca8fe5-562c-49e7-b6ea-018eb0bf9739-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.326703 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.326713 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5lqk\" (UniqueName: \"kubernetes.io/projected/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-kube-api-access-k5lqk\") on node \"crc\" DevicePath \"\"" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.326721 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e54df0-321a-4a59-800e-1bdfdb4b6eaf-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.613404 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rs8tv"] Mar 19 15:34:58 crc kubenswrapper[4771]: W0319 15:34:58.626625 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9af24d0b_354c_4109_a11a_2e56c65b8b0a.slice/crio-3237cdac7f08055ccd8947272437390cbb03368a755184fed85c9e4aa33861a1 WatchSource:0}: Error finding container 3237cdac7f08055ccd8947272437390cbb03368a755184fed85c9e4aa33861a1: Status 404 returned error can't find the 
container with id 3237cdac7f08055ccd8947272437390cbb03368a755184fed85c9e4aa33861a1 Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.626727 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f","Type":"ContainerStarted","Data":"68ab1fba0859988258ffe0748855ad328979283c8a88842e0ca934fe3ea00b1c"} Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.628052 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.628043 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-hrhlg" event={"ID":"62ca8fe5-562c-49e7-b6ea-018eb0bf9739","Type":"ContainerDied","Data":"ee691fb9efcba437180f6d47f3dd10dc9469d8e85340631ec305c33a80a1f5a4"} Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.633277 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.633279 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vj2zl" event={"ID":"07e54df0-321a-4a59-800e-1bdfdb4b6eaf","Type":"ContainerDied","Data":"1b21cfcad079ee83f28efa1ab8c308841fd6e5abb050c66aaeb0ff8233b3c188"} Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.639407 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"71aa1a31-80b5-40d9-9549-f12b2f0c34aa","Type":"ContainerStarted","Data":"b4dd0a340360e6388a6ac3d1cd5b22032d306df4b98eb351d963f7d234c328c4"} Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.700428 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hrhlg"] Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.716220 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hrhlg"] Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.733025 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vj2zl"] Mar 19 15:34:58 crc kubenswrapper[4771]: I0319 15:34:58.743641 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vj2zl"] Mar 19 15:34:59 crc kubenswrapper[4771]: I0319 15:34:59.517671 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07e54df0-321a-4a59-800e-1bdfdb4b6eaf" path="/var/lib/kubelet/pods/07e54df0-321a-4a59-800e-1bdfdb4b6eaf/volumes" Mar 19 15:34:59 crc kubenswrapper[4771]: I0319 15:34:59.518184 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ca8fe5-562c-49e7-b6ea-018eb0bf9739" path="/var/lib/kubelet/pods/62ca8fe5-562c-49e7-b6ea-018eb0bf9739/volumes" Mar 19 15:34:59 crc kubenswrapper[4771]: I0319 15:34:59.646391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rs8tv" 
event={"ID":"9af24d0b-354c-4109-a11a-2e56c65b8b0a","Type":"ContainerStarted","Data":"3237cdac7f08055ccd8947272437390cbb03368a755184fed85c9e4aa33861a1"} Mar 19 15:35:00 crc kubenswrapper[4771]: I0319 15:35:00.654732 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"3846aac8cb06f5b9e133188e72469bd0f51a9cfb84c06e142e5eaccd41e6326d"} Mar 19 15:35:00 crc kubenswrapper[4771]: I0319 15:35:00.658496 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"8d82de8c0a9a55c60c139dfff637c54b671c7709788c1b5b12ec26d65e83f90e"} Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.742094 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1e06921b-f2eb-4ef0-8256-405214a269e0","Type":"ContainerStarted","Data":"c723a9d6c7a4a677f553105d420c68524cd16a7d1b79b0cba25ccf936eb120f1"} Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.742603 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.749659 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"49580579-6baf-4e6b-85e1-0dba0fb59d97","Type":"ContainerStarted","Data":"44226f5fb8d2fc9220267c9d0bb8e088337bbab1b6cafa4247ed99801a50571c"} Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.749816 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.751016 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f","Type":"ContainerStarted","Data":"72668e387df6bd211878aa53ac611d5bad886b56d7fa25dbd6538aae256ed953"} Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.752311 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rs8tv" event={"ID":"9af24d0b-354c-4109-a11a-2e56c65b8b0a","Type":"ContainerStarted","Data":"fae3381544895b0ac5bdf49244eec96b5ffea3c1295fac0cf9dd195075378d1e"} Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.754967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w5jsx" event={"ID":"e8c6ed19-f258-4da0-966a-6c538b85dce1","Type":"ContainerStarted","Data":"89806d678097e7fcb8071813271d716ab64e0eb9f701328360882c4c46e99776"} Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.755114 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-w5jsx" Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.756535 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"daa4604a-2110-4000-a893-d7f308d29bce","Type":"ContainerStarted","Data":"5bbcbf5c6b12463e03328b2c22f46fe73a532ac6fe596a5c08a4f8025152e1dc"} Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.760477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"66d10600-3f91-4e77-a751-7f6fbe7148ea","Type":"ContainerStarted","Data":"94fc24ac37b251eed1fcec21fe5aa4edd10aa62b3b0773b4f58d28720fc9694d"} Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.769166 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.345471651 podStartE2EDuration="24.769146538s" podCreationTimestamp="2026-03-19 15:34:43 +0000 UTC" firstStartedPulling="2026-03-19 15:34:56.843606357 +0000 UTC m=+1156.072227569" lastFinishedPulling="2026-03-19 15:35:06.267281254 +0000 UTC 
m=+1165.495902456" observedRunningTime="2026-03-19 15:35:07.766191757 +0000 UTC m=+1166.994812959" watchObservedRunningTime="2026-03-19 15:35:07.769146538 +0000 UTC m=+1166.997767740" Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.770182 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"71aa1a31-80b5-40d9-9549-f12b2f0c34aa","Type":"ContainerStarted","Data":"927bc092380eade9717c706d56dd392a88b15da344c5720f316f7b2559eeb3a8"} Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.783212 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-w5jsx" podStartSLOduration=10.139468146 podStartE2EDuration="19.783194514s" podCreationTimestamp="2026-03-19 15:34:48 +0000 UTC" firstStartedPulling="2026-03-19 15:34:56.846179869 +0000 UTC m=+1156.074801081" lastFinishedPulling="2026-03-19 15:35:06.489906247 +0000 UTC m=+1165.718527449" observedRunningTime="2026-03-19 15:35:07.780996752 +0000 UTC m=+1167.009617954" watchObservedRunningTime="2026-03-19 15:35:07.783194514 +0000 UTC m=+1167.011815716" Mar 19 15:35:07 crc kubenswrapper[4771]: I0319 15:35:07.823155 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.756974584 podStartE2EDuration="22.823140671s" podCreationTimestamp="2026-03-19 15:34:45 +0000 UTC" firstStartedPulling="2026-03-19 15:34:57.298835651 +0000 UTC m=+1156.527456873" lastFinishedPulling="2026-03-19 15:35:07.365001758 +0000 UTC m=+1166.593622960" observedRunningTime="2026-03-19 15:35:07.822915506 +0000 UTC m=+1167.051536708" watchObservedRunningTime="2026-03-19 15:35:07.823140671 +0000 UTC m=+1167.051761863" Mar 19 15:35:08 crc kubenswrapper[4771]: I0319 15:35:08.782733 4771 generic.go:334] "Generic (PLEG): container finished" podID="9af24d0b-354c-4109-a11a-2e56c65b8b0a" containerID="fae3381544895b0ac5bdf49244eec96b5ffea3c1295fac0cf9dd195075378d1e" exitCode=0 Mar 19 15:35:08 
crc kubenswrapper[4771]: I0319 15:35:08.782929 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rs8tv" event={"ID":"9af24d0b-354c-4109-a11a-2e56c65b8b0a","Type":"ContainerDied","Data":"fae3381544895b0ac5bdf49244eec96b5ffea3c1295fac0cf9dd195075378d1e"} Mar 19 15:35:09 crc kubenswrapper[4771]: I0319 15:35:09.804149 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rs8tv" event={"ID":"9af24d0b-354c-4109-a11a-2e56c65b8b0a","Type":"ContainerStarted","Data":"87a4eea9eca38606710f52d79c8dd541e83435518dbfdc28ddbdcb24d6323d0a"} Mar 19 15:35:09 crc kubenswrapper[4771]: I0319 15:35:09.804488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rs8tv" event={"ID":"9af24d0b-354c-4109-a11a-2e56c65b8b0a","Type":"ContainerStarted","Data":"efe6198b85d1102b6b865781439cc52d1b3eab674892d251a130175fbc1cb4b9"} Mar 19 15:35:09 crc kubenswrapper[4771]: I0319 15:35:09.804542 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rs8tv" Mar 19 15:35:09 crc kubenswrapper[4771]: I0319 15:35:09.805598 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rs8tv" Mar 19 15:35:09 crc kubenswrapper[4771]: I0319 15:35:09.826928 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-rs8tv" podStartSLOduration=14.10103026 podStartE2EDuration="21.826909118s" podCreationTimestamp="2026-03-19 15:34:48 +0000 UTC" firstStartedPulling="2026-03-19 15:34:58.628152723 +0000 UTC m=+1157.856773925" lastFinishedPulling="2026-03-19 15:35:06.354031581 +0000 UTC m=+1165.582652783" observedRunningTime="2026-03-19 15:35:09.824851548 +0000 UTC m=+1169.053472760" watchObservedRunningTime="2026-03-19 15:35:09.826909118 +0000 UTC m=+1169.055530330" Mar 19 15:35:12 crc kubenswrapper[4771]: I0319 15:35:12.833187 4771 generic.go:334] "Generic (PLEG): container 
finished" podID="14a12c4b-9c29-45de-81ad-cc71b5235052" containerID="59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83" exitCode=0 Mar 19 15:35:12 crc kubenswrapper[4771]: I0319 15:35:12.833250 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" event={"ID":"14a12c4b-9c29-45de-81ad-cc71b5235052","Type":"ContainerDied","Data":"59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83"} Mar 19 15:35:12 crc kubenswrapper[4771]: I0319 15:35:12.836022 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f","Type":"ContainerStarted","Data":"bc35ff3f2bef12f36b597ccbdfa0159e85810b73e8bc9d42133faa4113802903"} Mar 19 15:35:12 crc kubenswrapper[4771]: I0319 15:35:12.838186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"71aa1a31-80b5-40d9-9549-f12b2f0c34aa","Type":"ContainerStarted","Data":"d5e9e59b052d04d33b67ad6500a240ec138e45f0bc82459223dc01cafd805a04"} Mar 19 15:35:12 crc kubenswrapper[4771]: I0319 15:35:12.896605 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.403186104 podStartE2EDuration="24.896558975s" podCreationTimestamp="2026-03-19 15:34:48 +0000 UTC" firstStartedPulling="2026-03-19 15:34:57.610242611 +0000 UTC m=+1156.838863803" lastFinishedPulling="2026-03-19 15:35:12.103615472 +0000 UTC m=+1171.332236674" observedRunningTime="2026-03-19 15:35:12.885790688 +0000 UTC m=+1172.114411930" watchObservedRunningTime="2026-03-19 15:35:12.896558975 +0000 UTC m=+1172.125180187" Mar 19 15:35:12 crc kubenswrapper[4771]: I0319 15:35:12.922687 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 19 15:35:12 crc kubenswrapper[4771]: I0319 15:35:12.938129 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.996089236 podStartE2EDuration="22.938106311s" podCreationTimestamp="2026-03-19 15:34:50 +0000 UTC" firstStartedPulling="2026-03-19 15:34:58.143576795 +0000 UTC m=+1157.372197997" lastFinishedPulling="2026-03-19 15:35:12.08559383 +0000 UTC m=+1171.314215072" observedRunningTime="2026-03-19 15:35:12.937549888 +0000 UTC m=+1172.166171100" watchObservedRunningTime="2026-03-19 15:35:12.938106311 +0000 UTC m=+1172.166727533" Mar 19 15:35:12 crc kubenswrapper[4771]: I0319 15:35:12.985302 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.470669 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.607414 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.648184 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.845717 4771 generic.go:334] "Generic (PLEG): container finished" podID="daa4604a-2110-4000-a893-d7f308d29bce" containerID="5bbcbf5c6b12463e03328b2c22f46fe73a532ac6fe596a5c08a4f8025152e1dc" exitCode=0 Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.845768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"daa4604a-2110-4000-a893-d7f308d29bce","Type":"ContainerDied","Data":"5bbcbf5c6b12463e03328b2c22f46fe73a532ac6fe596a5c08a4f8025152e1dc"} Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.848403 4771 generic.go:334] "Generic (PLEG): container finished" podID="511c87f1-6ccf-4c31-bc90-73af11c879e7" containerID="09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900" exitCode=0 Mar 19 15:35:13 crc 
kubenswrapper[4771]: I0319 15:35:13.848511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" event={"ID":"511c87f1-6ccf-4c31-bc90-73af11c879e7","Type":"ContainerDied","Data":"09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900"} Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.850471 4771 generic.go:334] "Generic (PLEG): container finished" podID="66d10600-3f91-4e77-a751-7f6fbe7148ea" containerID="94fc24ac37b251eed1fcec21fe5aa4edd10aa62b3b0773b4f58d28720fc9694d" exitCode=0 Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.850563 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"66d10600-3f91-4e77-a751-7f6fbe7148ea","Type":"ContainerDied","Data":"94fc24ac37b251eed1fcec21fe5aa4edd10aa62b3b0773b4f58d28720fc9694d"} Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.854314 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" event={"ID":"14a12c4b-9c29-45de-81ad-cc71b5235052","Type":"ContainerStarted","Data":"c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2"} Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.855196 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.855294 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.855369 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.896855 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" podStartSLOduration=2.594414054 podStartE2EDuration="34.896839136s" podCreationTimestamp="2026-03-19 15:34:39 +0000 UTC" 
firstStartedPulling="2026-03-19 15:34:39.783732481 +0000 UTC m=+1139.012353683" lastFinishedPulling="2026-03-19 15:35:12.086157553 +0000 UTC m=+1171.314778765" observedRunningTime="2026-03-19 15:35:13.893788462 +0000 UTC m=+1173.122409674" watchObservedRunningTime="2026-03-19 15:35:13.896839136 +0000 UTC m=+1173.125460338" Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.929551 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 19 15:35:13 crc kubenswrapper[4771]: I0319 15:35:13.929634 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.238380 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zfhtw"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.275714 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gtkj7"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.277101 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.279544 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.292229 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gtkj7"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.322868 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-lrwtz"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.324173 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.326304 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.349216 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lrwtz"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.367906 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbm8z\" (UniqueName: \"kubernetes.io/projected/def3f27c-03ff-4f92-895a-b3fb6ea64130-kube-api-access-dbm8z\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.368230 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.368423 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-config\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.369873 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/def3f27c-03ff-4f92-895a-b3fb6ea64130-ovs-rundir\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " 
pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.369951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/def3f27c-03ff-4f92-895a-b3fb6ea64130-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.370023 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def3f27c-03ff-4f92-895a-b3fb6ea64130-combined-ca-bundle\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.370114 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/def3f27c-03ff-4f92-895a-b3fb6ea64130-config\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.370146 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.370176 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/def3f27c-03ff-4f92-895a-b3fb6ea64130-ovn-rundir\") pod \"ovn-controller-metrics-lrwtz\" (UID: 
\"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.370240 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttrdc\" (UniqueName: \"kubernetes.io/projected/d146c23d-23c2-480b-b44f-789ba9a1cfd7-kube-api-access-ttrdc\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.390593 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jc5w2"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.418469 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-429lt"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.420100 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.427300 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.467963 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.469282 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.479623 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.480455 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-q7vfg" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.480967 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/def3f27c-03ff-4f92-895a-b3fb6ea64130-ovs-rundir\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/def3f27c-03ff-4f92-895a-b3fb6ea64130-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def3f27c-03ff-4f92-895a-b3fb6ea64130-combined-ca-bundle\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481436 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/def3f27c-03ff-4f92-895a-b3fb6ea64130-config\") pod 
\"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481462 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481484 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/def3f27c-03ff-4f92-895a-b3fb6ea64130-ovn-rundir\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481649 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttrdc\" (UniqueName: \"kubernetes.io/projected/d146c23d-23c2-480b-b44f-789ba9a1cfd7-kube-api-access-ttrdc\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481744 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbm8z\" (UniqueName: \"kubernetes.io/projected/def3f27c-03ff-4f92-895a-b3fb6ea64130-kube-api-access-dbm8z\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481750 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/def3f27c-03ff-4f92-895a-b3fb6ea64130-ovs-rundir\") pod \"ovn-controller-metrics-lrwtz\" (UID: 
\"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.481767 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.482050 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-config\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.482928 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.483422 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.483605 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/def3f27c-03ff-4f92-895a-b3fb6ea64130-ovn-rundir\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.484505 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/def3f27c-03ff-4f92-895a-b3fb6ea64130-config\") pod 
\"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.484602 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-config\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.484608 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.507412 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/def3f27c-03ff-4f92-895a-b3fb6ea64130-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.507450 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def3f27c-03ff-4f92-895a-b3fb6ea64130-combined-ca-bundle\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.514656 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttrdc\" (UniqueName: \"kubernetes.io/projected/d146c23d-23c2-480b-b44f-789ba9a1cfd7-kube-api-access-ttrdc\") pod \"dnsmasq-dns-7fd796d7df-gtkj7\" (UID: 
\"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") " pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.527819 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbm8z\" (UniqueName: \"kubernetes.io/projected/def3f27c-03ff-4f92-895a-b3fb6ea64130-kube-api-access-dbm8z\") pod \"ovn-controller-metrics-lrwtz\" (UID: \"def3f27c-03ff-4f92-895a-b3fb6ea64130\") " pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.550069 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.559176 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-429lt"] Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.584870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbec9e9-0922-4f52-aafc-409365715a4a-config\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.584926 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgtb8\" (UniqueName: \"kubernetes.io/projected/8ab31eac-215d-4527-80f1-68ef6224e8ed-kube-api-access-mgtb8\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.584954 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbec9e9-0922-4f52-aafc-409365715a4a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc 
kubenswrapper[4771]: I0319 15:35:14.585002 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.585022 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-config\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.585044 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbec9e9-0922-4f52-aafc-409365715a4a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.585060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.585084 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbec9e9-0922-4f52-aafc-409365715a4a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 
15:35:14.585105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssf4m\" (UniqueName: \"kubernetes.io/projected/cbbec9e9-0922-4f52-aafc-409365715a4a-kube-api-access-ssf4m\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.585124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.585158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cbbec9e9-0922-4f52-aafc-409365715a4a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.585175 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbbec9e9-0922-4f52-aafc-409365715a4a-scripts\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.605793 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.643162 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-lrwtz" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686635 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbec9e9-0922-4f52-aafc-409365715a4a-config\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686697 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgtb8\" (UniqueName: \"kubernetes.io/projected/8ab31eac-215d-4527-80f1-68ef6224e8ed-kube-api-access-mgtb8\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686727 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbec9e9-0922-4f52-aafc-409365715a4a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686769 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686791 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-config\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 
15:35:14.686811 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbec9e9-0922-4f52-aafc-409365715a4a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686850 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbec9e9-0922-4f52-aafc-409365715a4a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686871 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssf4m\" (UniqueName: \"kubernetes.io/projected/cbbec9e9-0922-4f52-aafc-409365715a4a-kube-api-access-ssf4m\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686890 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686910 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/cbbec9e9-0922-4f52-aafc-409365715a4a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.686929 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbbec9e9-0922-4f52-aafc-409365715a4a-scripts\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.687730 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbbec9e9-0922-4f52-aafc-409365715a4a-scripts\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.688295 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbec9e9-0922-4f52-aafc-409365715a4a-config\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.690243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-config\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.691150 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.692803 
4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbec9e9-0922-4f52-aafc-409365715a4a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.693348 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cbbec9e9-0922-4f52-aafc-409365715a4a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.693837 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbec9e9-0922-4f52-aafc-409365715a4a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.695275 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbec9e9-0922-4f52-aafc-409365715a4a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.695336 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.695896 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-dns-svc\") pod 
\"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.718886 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgtb8\" (UniqueName: \"kubernetes.io/projected/8ab31eac-215d-4527-80f1-68ef6224e8ed-kube-api-access-mgtb8\") pod \"dnsmasq-dns-86db49b7ff-429lt\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.719911 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssf4m\" (UniqueName: \"kubernetes.io/projected/cbbec9e9-0922-4f52-aafc-409365715a4a-kube-api-access-ssf4m\") pod \"ovn-northd-0\" (UID: \"cbbec9e9-0922-4f52-aafc-409365715a4a\") " pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.759639 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.870914 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.871783 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"66d10600-3f91-4e77-a751-7f6fbe7148ea","Type":"ContainerStarted","Data":"5c53ba147faf4b7b70db3fb5a51851b183610bd77305810db55c1dbd025f559b"} Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.898710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"daa4604a-2110-4000-a893-d7f308d29bce","Type":"ContainerStarted","Data":"f0cf382db5345fcd5dacda9c8f8ef441ac9ccef5416f1e87a90009558b780c4c"} Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.934303 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.01672426 podStartE2EDuration="34.934284596s" podCreationTimestamp="2026-03-19 15:34:40 +0000 UTC" firstStartedPulling="2026-03-19 15:34:50.57232115 +0000 UTC m=+1149.800942352" lastFinishedPulling="2026-03-19 15:35:06.489881486 +0000 UTC m=+1165.718502688" observedRunningTime="2026-03-19 15:35:14.919067371 +0000 UTC m=+1174.147688573" watchObservedRunningTime="2026-03-19 15:35:14.934284596 +0000 UTC m=+1174.162905798" Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.940280 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" podUID="511c87f1-6ccf-4c31-bc90-73af11c879e7" containerName="dnsmasq-dns" containerID="cri-o://b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7" gracePeriod=10 Mar 19 15:35:14 crc kubenswrapper[4771]: I0319 15:35:14.940901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" event={"ID":"511c87f1-6ccf-4c31-bc90-73af11c879e7","Type":"ContainerStarted","Data":"b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7"} Mar 19 15:35:14 crc 
kubenswrapper[4771]: I0319 15:35:14.997903 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.656826829 podStartE2EDuration="33.997882389s" podCreationTimestamp="2026-03-19 15:34:41 +0000 UTC" firstStartedPulling="2026-03-19 15:34:55.926016388 +0000 UTC m=+1155.154637590" lastFinishedPulling="2026-03-19 15:35:06.267071948 +0000 UTC m=+1165.495693150" observedRunningTime="2026-03-19 15:35:14.956748354 +0000 UTC m=+1174.185369556" watchObservedRunningTime="2026-03-19 15:35:14.997882389 +0000 UTC m=+1174.226503591" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.013877 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gtkj7"] Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.019106 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" podStartSLOduration=-9223371999.83569 podStartE2EDuration="37.019087126s" podCreationTimestamp="2026-03-19 15:34:38 +0000 UTC" firstStartedPulling="2026-03-19 15:34:39.623565714 +0000 UTC m=+1138.852186916" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:14.980452691 +0000 UTC m=+1174.209073903" watchObservedRunningTime="2026-03-19 15:35:15.019087126 +0000 UTC m=+1174.247708328" Mar 19 15:35:15 crc kubenswrapper[4771]: W0319 15:35:15.092933 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd146c23d_23c2_480b_b44f_789ba9a1cfd7.slice/crio-79d1728882f31c466ec00719ba8274dcf45ce7b0f07408b15303b0c7f34277bd WatchSource:0}: Error finding container 79d1728882f31c466ec00719ba8274dcf45ce7b0f07408b15303b0c7f34277bd: Status 404 returned error can't find the container with id 79d1728882f31c466ec00719ba8274dcf45ce7b0f07408b15303b0c7f34277bd Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.358259 4771 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lrwtz"] Mar 19 15:35:15 crc kubenswrapper[4771]: W0319 15:35:15.379759 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddef3f27c_03ff_4f92_895a_b3fb6ea64130.slice/crio-1f6bb13f6338bb662d10811d83b5cee1c3b6d11ea4d5a85f973cee31926abd12 WatchSource:0}: Error finding container 1f6bb13f6338bb662d10811d83b5cee1c3b6d11ea4d5a85f973cee31926abd12: Status 404 returned error can't find the container with id 1f6bb13f6338bb662d10811d83b5cee1c3b6d11ea4d5a85f973cee31926abd12 Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.474713 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.543333 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gtkj7"] Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.566449 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-429lt"] Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.600796 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-p67w5"] Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.602960 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.606836 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-p67w5"] Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.657794 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.662233 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 15:35:15 crc kubenswrapper[4771]: W0319 15:35:15.690296 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbbec9e9_0922_4f52_aafc_409365715a4a.slice/crio-4bf1768e27c8c4cfa05a02ad417e46811bde95c48d4dd801c2b8d81cfd6a78fb WatchSource:0}: Error finding container 4bf1768e27c8c4cfa05a02ad417e46811bde95c48d4dd801c2b8d81cfd6a78fb: Status 404 returned error can't find the container with id 4bf1768e27c8c4cfa05a02ad417e46811bde95c48d4dd801c2b8d81cfd6a78fb Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.722612 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw25d\" (UniqueName: \"kubernetes.io/projected/511c87f1-6ccf-4c31-bc90-73af11c879e7-kube-api-access-nw25d\") pod \"511c87f1-6ccf-4c31-bc90-73af11c879e7\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.722837 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-config\") pod \"511c87f1-6ccf-4c31-bc90-73af11c879e7\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.722906 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-dns-svc\") pod \"511c87f1-6ccf-4c31-bc90-73af11c879e7\" (UID: \"511c87f1-6ccf-4c31-bc90-73af11c879e7\") " Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.723129 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-config\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.723203 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-dns-svc\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.723243 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cpr5\" (UniqueName: \"kubernetes.io/projected/96edfd66-abb4-4935-adcf-22c80205e7c2-kube-api-access-9cpr5\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.725043 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.725066 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.729949 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/511c87f1-6ccf-4c31-bc90-73af11c879e7-kube-api-access-nw25d" (OuterVolumeSpecName: "kube-api-access-nw25d") pod "511c87f1-6ccf-4c31-bc90-73af11c879e7" (UID: "511c87f1-6ccf-4c31-bc90-73af11c879e7"). InnerVolumeSpecName "kube-api-access-nw25d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.792005 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "511c87f1-6ccf-4c31-bc90-73af11c879e7" (UID: "511c87f1-6ccf-4c31-bc90-73af11c879e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.802163 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-config" (OuterVolumeSpecName: "config") pod "511c87f1-6ccf-4c31-bc90-73af11c879e7" (UID: "511c87f1-6ccf-4c31-bc90-73af11c879e7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.829597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-dns-svc\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.829664 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cpr5\" (UniqueName: \"kubernetes.io/projected/96edfd66-abb4-4935-adcf-22c80205e7c2-kube-api-access-9cpr5\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.829699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.830532 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-dns-svc\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.830642 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc 
kubenswrapper[4771]: I0319 15:35:15.830691 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.830797 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-config\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.830885 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.830901 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511c87f1-6ccf-4c31-bc90-73af11c879e7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.830914 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw25d\" (UniqueName: \"kubernetes.io/projected/511c87f1-6ccf-4c31-bc90-73af11c879e7-kube-api-access-nw25d\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.831488 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.831601 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-config\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.848070 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cpr5\" (UniqueName: \"kubernetes.io/projected/96edfd66-abb4-4935-adcf-22c80205e7c2-kube-api-access-9cpr5\") pod \"dnsmasq-dns-698758b865-p67w5\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.946363 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.949298 4771 generic.go:334] "Generic (PLEG): container finished" podID="8ab31eac-215d-4527-80f1-68ef6224e8ed" containerID="e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add" exitCode=0 Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.949465 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" event={"ID":"8ab31eac-215d-4527-80f1-68ef6224e8ed","Type":"ContainerDied","Data":"e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add"} Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.949523 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" event={"ID":"8ab31eac-215d-4527-80f1-68ef6224e8ed","Type":"ContainerStarted","Data":"0296ce5145112712e2ab33811536c4b587381f31b88a5b7f6f43723d50e645d4"} Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.951391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"cbbec9e9-0922-4f52-aafc-409365715a4a","Type":"ContainerStarted","Data":"4bf1768e27c8c4cfa05a02ad417e46811bde95c48d4dd801c2b8d81cfd6a78fb"} Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.952739 4771 generic.go:334] "Generic (PLEG): container finished" podID="d146c23d-23c2-480b-b44f-789ba9a1cfd7" containerID="7fb24555dcec70796b04da49eadb3fc5b3faaa6212d564c342f26c1ca8879226" exitCode=0 Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.952778 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" event={"ID":"d146c23d-23c2-480b-b44f-789ba9a1cfd7","Type":"ContainerDied","Data":"7fb24555dcec70796b04da49eadb3fc5b3faaa6212d564c342f26c1ca8879226"} Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.952805 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" event={"ID":"d146c23d-23c2-480b-b44f-789ba9a1cfd7","Type":"ContainerStarted","Data":"79d1728882f31c466ec00719ba8274dcf45ce7b0f07408b15303b0c7f34277bd"} Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.964073 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lrwtz" event={"ID":"def3f27c-03ff-4f92-895a-b3fb6ea64130","Type":"ContainerStarted","Data":"615f22b33a956378f07b1fa58ca83befb9e27aa80ef2ef673b11ccd5ae48a05b"} Mar 19 15:35:15 crc kubenswrapper[4771]: I0319 15:35:15.964436 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lrwtz" event={"ID":"def3f27c-03ff-4f92-895a-b3fb6ea64130","Type":"ContainerStarted","Data":"1f6bb13f6338bb662d10811d83b5cee1c3b6d11ea4d5a85f973cee31926abd12"} Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:15.995189 4771 generic.go:334] "Generic (PLEG): container finished" podID="511c87f1-6ccf-4c31-bc90-73af11c879e7" containerID="b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7" exitCode=0 Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:15.995253 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" event={"ID":"511c87f1-6ccf-4c31-bc90-73af11c879e7","Type":"ContainerDied","Data":"b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7"} Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:15.995270 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:15.995309 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jc5w2" event={"ID":"511c87f1-6ccf-4c31-bc90-73af11c879e7","Type":"ContainerDied","Data":"39ed69d7e863a0c6c2b44b606c1c5fffad966fadb0cb1bf09a77040fa217d39e"} Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:15.995332 4771 scope.go:117] "RemoveContainer" containerID="b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:15.996999 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" podUID="14a12c4b-9c29-45de-81ad-cc71b5235052" containerName="dnsmasq-dns" containerID="cri-o://c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2" gracePeriod=10 Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.014055 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-lrwtz" podStartSLOduration=2.014038059 podStartE2EDuration="2.014038059s" podCreationTimestamp="2026-03-19 15:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:16.008967407 +0000 UTC m=+1175.237588609" watchObservedRunningTime="2026-03-19 15:35:16.014038059 +0000 UTC m=+1175.242659261" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.095832 4771 scope.go:117] "RemoveContainer" 
containerID="09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.152976 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jc5w2"] Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.166057 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jc5w2"] Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.263586 4771 scope.go:117] "RemoveContainer" containerID="b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.264352 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7\": container with ID starting with b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7 not found: ID does not exist" containerID="b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.264406 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7"} err="failed to get container status \"b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7\": rpc error: code = NotFound desc = could not find container \"b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7\": container with ID starting with b1f2cf36bf47d99b410085301faaf4eca6350d8eb43cac3bfaa182299a7610a7 not found: ID does not exist" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.264437 4771 scope.go:117] "RemoveContainer" containerID="09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.264782 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900\": container with ID starting with 09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900 not found: ID does not exist" containerID="09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.264812 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900"} err="failed to get container status \"09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900\": rpc error: code = NotFound desc = could not find container \"09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900\": container with ID starting with 09722ed0a5fd8bb5caa5f33e2f4230a3dcf5d13292355d4961ff4e5e26700900 not found: ID does not exist" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.383499 4771 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 19 15:35:16 crc kubenswrapper[4771]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/d146c23d-23c2-480b-b44f-789ba9a1cfd7/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 15:35:16 crc kubenswrapper[4771]: > podSandboxID="79d1728882f31c466ec00719ba8274dcf45ce7b0f07408b15303b0c7f34277bd" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.383655 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:35:16 crc kubenswrapper[4771]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bfh5d7h8hd8h664h564hfbh5d4h5f5h55h5fch66h675hb8h65bh64dhbh5dchc9h66fh5dbhf4h658h64ch55bhbh65h55dh597h68dh579hbdq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ttrdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7fd796d7df-gtkj7_openstack(d146c23d-23c2-480b-b44f-789ba9a1cfd7): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/d146c23d-23c2-480b-b44f-789ba9a1cfd7/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 15:35:16 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.385478 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/d146c23d-23c2-480b-b44f-789ba9a1cfd7/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" podUID="d146c23d-23c2-480b-b44f-789ba9a1cfd7" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.397319 4771 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 19 15:35:16 crc kubenswrapper[4771]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/8ab31eac-215d-4527-80f1-68ef6224e8ed/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 15:35:16 crc kubenswrapper[4771]: > podSandboxID="0296ce5145112712e2ab33811536c4b587381f31b88a5b7f6f43723d50e645d4" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.397545 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 15:35:16 crc kubenswrapper[4771]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,Recursi
veReadOnly:nil,},VolumeMount{Name:kube-api-access-mgtb8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-429lt_openstack(8ab31eac-215d-4527-80f1-68ef6224e8ed): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/8ab31eac-215d-4527-80f1-68ef6224e8ed/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 15:35:16 crc kubenswrapper[4771]: > logger="UnhandledError" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.400299 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount 
`/var/lib/kubelet/pods/8ab31eac-215d-4527-80f1-68ef6224e8ed/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" podUID="8ab31eac-215d-4527-80f1-68ef6224e8ed" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.515451 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.577877 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-p67w5"] Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.649427 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7xg5\" (UniqueName: \"kubernetes.io/projected/14a12c4b-9c29-45de-81ad-cc71b5235052-kube-api-access-d7xg5\") pod \"14a12c4b-9c29-45de-81ad-cc71b5235052\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.649494 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-dns-svc\") pod \"14a12c4b-9c29-45de-81ad-cc71b5235052\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.649560 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-config\") pod \"14a12c4b-9c29-45de-81ad-cc71b5235052\" (UID: \"14a12c4b-9c29-45de-81ad-cc71b5235052\") " Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.653619 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a12c4b-9c29-45de-81ad-cc71b5235052-kube-api-access-d7xg5" (OuterVolumeSpecName: "kube-api-access-d7xg5") pod "14a12c4b-9c29-45de-81ad-cc71b5235052" (UID: 
"14a12c4b-9c29-45de-81ad-cc71b5235052"). InnerVolumeSpecName "kube-api-access-d7xg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.698143 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "14a12c4b-9c29-45de-81ad-cc71b5235052" (UID: "14a12c4b-9c29-45de-81ad-cc71b5235052"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.709873 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.710248 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a12c4b-9c29-45de-81ad-cc71b5235052" containerName="init" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.710271 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a12c4b-9c29-45de-81ad-cc71b5235052" containerName="init" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.710307 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a12c4b-9c29-45de-81ad-cc71b5235052" containerName="dnsmasq-dns" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.710314 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a12c4b-9c29-45de-81ad-cc71b5235052" containerName="dnsmasq-dns" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.710330 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="511c87f1-6ccf-4c31-bc90-73af11c879e7" containerName="init" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.710336 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="511c87f1-6ccf-4c31-bc90-73af11c879e7" containerName="init" Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.710356 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="511c87f1-6ccf-4c31-bc90-73af11c879e7" containerName="dnsmasq-dns" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.710362 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="511c87f1-6ccf-4c31-bc90-73af11c879e7" containerName="dnsmasq-dns" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.710555 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a12c4b-9c29-45de-81ad-cc71b5235052" containerName="dnsmasq-dns" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.710580 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="511c87f1-6ccf-4c31-bc90-73af11c879e7" containerName="dnsmasq-dns" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.715747 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.716037 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.719581 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.720086 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.720241 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.722164 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dxms8" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.727786 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-config" (OuterVolumeSpecName: "config") pod "14a12c4b-9c29-45de-81ad-cc71b5235052" (UID: "14a12c4b-9c29-45de-81ad-cc71b5235052"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.753603 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7xg5\" (UniqueName: \"kubernetes.io/projected/14a12c4b-9c29-45de-81ad-cc71b5235052-kube-api-access-d7xg5\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.753642 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.753653 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a12c4b-9c29-45de-81ad-cc71b5235052-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.855370 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d58e24-649b-4142-a62a-64c9919fe0e4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.855458 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/67d58e24-649b-4142-a62a-64c9919fe0e4-lock\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.855483 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7b5\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-kube-api-access-4b7b5\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " 
pod="openstack/swift-storage-0" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.855781 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.855887 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/67d58e24-649b-4142-a62a-64c9919fe0e4-cache\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.856038 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.957270 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.957338 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/67d58e24-649b-4142-a62a-64c9919fe0e4-cache\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0" Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.957400 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.957445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d58e24-649b-4142-a62a-64c9919fe0e4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.957476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/67d58e24-649b-4142-a62a-64c9919fe0e4-lock\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.957498 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b7b5\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-kube-api-access-4b7b5\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.957504 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.957534 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 15:35:16 crc kubenswrapper[4771]: E0319 15:35:16.957598 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift podName:67d58e24-649b-4142-a62a-64c9919fe0e4 nodeName:}" failed. No retries permitted until 2026-03-19 15:35:17.45757524 +0000 UTC m=+1176.686196522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift") pod "swift-storage-0" (UID: "67d58e24-649b-4142-a62a-64c9919fe0e4") : configmap "swift-ring-files" not found
Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.958357 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/67d58e24-649b-4142-a62a-64c9919fe0e4-cache\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.958445 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/67d58e24-649b-4142-a62a-64c9919fe0e4-lock\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.958833 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0"
Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.962850 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d58e24-649b-4142-a62a-64c9919fe0e4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.978868 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7b5\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-kube-api-access-4b7b5\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:16 crc kubenswrapper[4771]: I0319 15:35:16.990351 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.005480 4771 generic.go:334] "Generic (PLEG): container finished" podID="14a12c4b-9c29-45de-81ad-cc71b5235052" containerID="c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2" exitCode=0
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.005572 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.005575 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" event={"ID":"14a12c4b-9c29-45de-81ad-cc71b5235052","Type":"ContainerDied","Data":"c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2"}
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.005729 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zfhtw" event={"ID":"14a12c4b-9c29-45de-81ad-cc71b5235052","Type":"ContainerDied","Data":"5a1da17fc129dedd6bf28cb80892e974e2c4a7e611c31f108b8f1df27d053758"}
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.005766 4771 scope.go:117] "RemoveContainer" containerID="c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.009098 4771 generic.go:334] "Generic (PLEG): container finished" podID="96edfd66-abb4-4935-adcf-22c80205e7c2" containerID="b01204d15c6fd22ef3be6aa795baa0db56c6423b963e3d8a8bc2cd48f076bacc" exitCode=0
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.010665 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-p67w5" event={"ID":"96edfd66-abb4-4935-adcf-22c80205e7c2","Type":"ContainerDied","Data":"b01204d15c6fd22ef3be6aa795baa0db56c6423b963e3d8a8bc2cd48f076bacc"}
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.010711 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-p67w5" event={"ID":"96edfd66-abb4-4935-adcf-22c80205e7c2","Type":"ContainerStarted","Data":"2a1605b8dde876a4c894a40e6499c87509299d74d90ba003a69e757f0816ee8c"}
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.064121 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zfhtw"]
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.072801 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zfhtw"]
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.086692 4771 scope.go:117] "RemoveContainer" containerID="59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.153043 4771 scope.go:117] "RemoveContainer" containerID="c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2"
Mar 19 15:35:17 crc kubenswrapper[4771]: E0319 15:35:17.162054 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2\": container with ID starting with c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2 not found: ID does not exist" containerID="c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.162098 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2"} err="failed to get container status \"c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2\": rpc error: code = NotFound desc = could not find container \"c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2\": container with ID starting with c3e4150f84547d6bf6668ac809f295fb98c92e21403c917fa43159844e6a18c2 not found: ID does not exist"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.162122 4771 scope.go:117] "RemoveContainer" containerID="59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83"
Mar 19 15:35:17 crc kubenswrapper[4771]: E0319 15:35:17.167468 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83\": container with ID starting with 59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83 not found: ID does not exist" containerID="59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.167517 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83"} err="failed to get container status \"59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83\": rpc error: code = NotFound desc = could not find container \"59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83\": container with ID starting with 59d6e60a4c705ad336c739aa579f9ae6eab69cb252b08d75aebdfabd91f9fa83 not found: ID does not exist"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.188328 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7tbkt"]
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.189932 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.194040 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.194311 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.194433 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.285498 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7tbkt"]
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.365660 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-scripts\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.366067 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5kpb\" (UniqueName: \"kubernetes.io/projected/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-kube-api-access-k5kpb\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.366120 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-ring-data-devices\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.366193 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-combined-ca-bundle\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.366240 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-swiftconf\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.366267 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-etc-swift\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.366312 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-dispersionconf\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.387564 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.467849 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-swiftconf\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.467893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-etc-swift\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.467935 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-dispersionconf\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.468006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-scripts\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.468067 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5kpb\" (UniqueName: \"kubernetes.io/projected/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-kube-api-access-k5kpb\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.468101 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-ring-data-devices\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.468138 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.468163 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-combined-ca-bundle\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: E0319 15:35:17.469253 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 15:35:17 crc kubenswrapper[4771]: E0319 15:35:17.469286 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 15:35:17 crc kubenswrapper[4771]: E0319 15:35:17.469339 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift podName:67d58e24-649b-4142-a62a-64c9919fe0e4 nodeName:}" failed. No retries permitted until 2026-03-19 15:35:18.469316497 +0000 UTC m=+1177.697937699 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift") pod "swift-storage-0" (UID: "67d58e24-649b-4142-a62a-64c9919fe0e4") : configmap "swift-ring-files" not found
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.469516 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-ring-data-devices\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.469520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-scripts\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.469615 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-etc-swift\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.472154 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-dispersionconf\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.472389 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-swiftconf\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.472845 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-combined-ca-bundle\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.487030 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5kpb\" (UniqueName: \"kubernetes.io/projected/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-kube-api-access-k5kpb\") pod \"swift-ring-rebalance-7tbkt\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") " pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.519727 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a12c4b-9c29-45de-81ad-cc71b5235052" path="/var/lib/kubelet/pods/14a12c4b-9c29-45de-81ad-cc71b5235052/volumes"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.520425 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="511c87f1-6ccf-4c31-bc90-73af11c879e7" path="/var/lib/kubelet/pods/511c87f1-6ccf-4c31-bc90-73af11c879e7/volumes"
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.569075 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-ovsdbserver-nb\") pod \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") "
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.569271 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-dns-svc\") pod \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") "
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.569373 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttrdc\" (UniqueName: \"kubernetes.io/projected/d146c23d-23c2-480b-b44f-789ba9a1cfd7-kube-api-access-ttrdc\") pod \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") "
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.569413 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-config\") pod \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\" (UID: \"d146c23d-23c2-480b-b44f-789ba9a1cfd7\") "
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.573732 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d146c23d-23c2-480b-b44f-789ba9a1cfd7-kube-api-access-ttrdc" (OuterVolumeSpecName: "kube-api-access-ttrdc") pod "d146c23d-23c2-480b-b44f-789ba9a1cfd7" (UID: "d146c23d-23c2-480b-b44f-789ba9a1cfd7"). InnerVolumeSpecName "kube-api-access-ttrdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.616700 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d146c23d-23c2-480b-b44f-789ba9a1cfd7" (UID: "d146c23d-23c2-480b-b44f-789ba9a1cfd7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.627220 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d146c23d-23c2-480b-b44f-789ba9a1cfd7" (UID: "d146c23d-23c2-480b-b44f-789ba9a1cfd7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.627309 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-config" (OuterVolumeSpecName: "config") pod "d146c23d-23c2-480b-b44f-789ba9a1cfd7" (UID: "d146c23d-23c2-480b-b44f-789ba9a1cfd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.671572 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttrdc\" (UniqueName: \"kubernetes.io/projected/d146c23d-23c2-480b-b44f-789ba9a1cfd7-kube-api-access-ttrdc\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.671615 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-config\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.671628 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.671640 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d146c23d-23c2-480b-b44f-789ba9a1cfd7-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:17 crc kubenswrapper[4771]: I0319 15:35:17.684023 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.018892 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-p67w5" event={"ID":"96edfd66-abb4-4935-adcf-22c80205e7c2","Type":"ContainerStarted","Data":"c1957c063a8828a1191cca0264d683874c10c5562f461010bbde789460c7963c"}
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.019381 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-p67w5"
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.021944 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" event={"ID":"8ab31eac-215d-4527-80f1-68ef6224e8ed","Type":"ContainerStarted","Data":"6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2"}
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.022304 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-429lt"
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.024788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cbbec9e9-0922-4f52-aafc-409365715a4a","Type":"ContainerStarted","Data":"11bea890cbce29eddbcc4f83d49c0f4c0e4b4f39d5e41e4ab7df357227fd6428"}
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.024833 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cbbec9e9-0922-4f52-aafc-409365715a4a","Type":"ContainerStarted","Data":"93a31a2a061c9e636b2151a7eea0329716ad1f8d6f236fec28a4e821f505f406"}
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.024934 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.028395 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7" event={"ID":"d146c23d-23c2-480b-b44f-789ba9a1cfd7","Type":"ContainerDied","Data":"79d1728882f31c466ec00719ba8274dcf45ce7b0f07408b15303b0c7f34277bd"}
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.028486 4771 scope.go:117] "RemoveContainer" containerID="7fb24555dcec70796b04da49eadb3fc5b3faaa6212d564c342f26c1ca8879226"
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.028512 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gtkj7"
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.042743 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-p67w5" podStartSLOduration=3.042726162 podStartE2EDuration="3.042726162s" podCreationTimestamp="2026-03-19 15:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:18.039570256 +0000 UTC m=+1177.268191468" watchObservedRunningTime="2026-03-19 15:35:18.042726162 +0000 UTC m=+1177.271347364"
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.066840 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" podStartSLOduration=4.066816359 podStartE2EDuration="4.066816359s" podCreationTimestamp="2026-03-19 15:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:18.062308821 +0000 UTC m=+1177.290930033" watchObservedRunningTime="2026-03-19 15:35:18.066816359 +0000 UTC m=+1177.295437561"
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.085931 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.624950723 podStartE2EDuration="4.085910287s" podCreationTimestamp="2026-03-19 15:35:14 +0000 UTC" firstStartedPulling="2026-03-19 15:35:15.691725469 +0000 UTC m=+1174.920346671" lastFinishedPulling="2026-03-19 15:35:17.152685033 +0000 UTC m=+1176.381306235" observedRunningTime="2026-03-19 15:35:18.077869214 +0000 UTC m=+1177.306490426" watchObservedRunningTime="2026-03-19 15:35:18.085910287 +0000 UTC m=+1177.314531509"
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.125928 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gtkj7"]
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.133535 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gtkj7"]
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.174306 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7tbkt"]
Mar 19 15:35:18 crc kubenswrapper[4771]: I0319 15:35:18.490653 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:18 crc kubenswrapper[4771]: E0319 15:35:18.490888 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 15:35:18 crc kubenswrapper[4771]: E0319 15:35:18.490922 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 15:35:18 crc kubenswrapper[4771]: E0319 15:35:18.491007 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift podName:67d58e24-649b-4142-a62a-64c9919fe0e4 nodeName:}" failed. No retries permitted until 2026-03-19 15:35:20.490969259 +0000 UTC m=+1179.719590461 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift") pod "swift-storage-0" (UID: "67d58e24-649b-4142-a62a-64c9919fe0e4") : configmap "swift-ring-files" not found
Mar 19 15:35:19 crc kubenswrapper[4771]: I0319 15:35:19.037543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7tbkt" event={"ID":"fde481a0-3182-4ad8-90ff-7fc8da0ecde2","Type":"ContainerStarted","Data":"7964ba80782b5c5d3f48e53d5b86497f3530b9e47e3538f1b0f84ad8fcea4bd5"}
Mar 19 15:35:19 crc kubenswrapper[4771]: I0319 15:35:19.520360 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d146c23d-23c2-480b-b44f-789ba9a1cfd7" path="/var/lib/kubelet/pods/d146c23d-23c2-480b-b44f-789ba9a1cfd7/volumes"
Mar 19 15:35:19 crc kubenswrapper[4771]: E0319 15:35:19.614507 4771 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.50:41660->38.102.83.50:41021: read tcp 38.102.83.50:41660->38.102.83.50:41021: read: connection reset by peer
Mar 19 15:35:20 crc kubenswrapper[4771]: I0319 15:35:20.526699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:20 crc kubenswrapper[4771]: E0319 15:35:20.527960 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 15:35:20 crc kubenswrapper[4771]: E0319 15:35:20.528004 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 15:35:20 crc kubenswrapper[4771]: E0319 15:35:20.528054 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift podName:67d58e24-649b-4142-a62a-64c9919fe0e4 nodeName:}" failed. No retries permitted until 2026-03-19 15:35:24.528037323 +0000 UTC m=+1183.756658515 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift") pod "swift-storage-0" (UID: "67d58e24-649b-4142-a62a-64c9919fe0e4") : configmap "swift-ring-files" not found
Mar 19 15:35:21 crc kubenswrapper[4771]: I0319 15:35:21.726249 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 19 15:35:21 crc kubenswrapper[4771]: I0319 15:35:21.726584 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 19 15:35:21 crc kubenswrapper[4771]: I0319 15:35:21.791720 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 19 15:35:22 crc kubenswrapper[4771]: I0319 15:35:22.132372 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 19 15:35:22 crc kubenswrapper[4771]: I0319 15:35:22.988034 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 19 15:35:22 crc kubenswrapper[4771]: I0319 15:35:22.988522 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.028000 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.028068 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.070886 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7tbkt" event={"ID":"fde481a0-3182-4ad8-90ff-7fc8da0ecde2","Type":"ContainerStarted","Data":"8148eb063ea59c6f89b97ff5a1c5e773b8b9477028f05e22f75ed6ee94846bf3"}
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.079588 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.116026 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-7tbkt" podStartSLOduration=2.159761899 podStartE2EDuration="6.116003223s" podCreationTimestamp="2026-03-19 15:35:17 +0000 UTC" firstStartedPulling="2026-03-19 15:35:18.188957955 +0000 UTC m=+1177.417579157" lastFinishedPulling="2026-03-19 15:35:22.145199249 +0000 UTC m=+1181.373820481" observedRunningTime="2026-03-19 15:35:23.090140504 +0000 UTC m=+1182.318761706" watchObservedRunningTime="2026-03-19 15:35:23.116003223 +0000 UTC m=+1182.344624445"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.169244 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.720611 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b5fa-account-create-update-hwb48"]
Mar 19 15:35:23 crc kubenswrapper[4771]: E0319 15:35:23.721226 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d146c23d-23c2-480b-b44f-789ba9a1cfd7" containerName="init"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.721335 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d146c23d-23c2-480b-b44f-789ba9a1cfd7" containerName="init"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.722099 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d146c23d-23c2-480b-b44f-789ba9a1cfd7" containerName="init"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.722847 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5fa-account-create-update-hwb48"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.729519 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.736524 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5fa-account-create-update-hwb48"]
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.751974 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-hvbsg"]
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.753062 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hvbsg"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.802795 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hvbsg"]
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.881886 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-operator-scripts\") pod \"glance-db-create-hvbsg\" (UID: \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\") " pod="openstack/glance-db-create-hvbsg"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.882203 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28rx9\" (UniqueName: \"kubernetes.io/projected/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-kube-api-access-28rx9\") pod \"glance-db-create-hvbsg\" (UID: \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\") " pod="openstack/glance-db-create-hvbsg"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.882502 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ed00e9-ecdb-408d-8ad6-f4272af25922-operator-scripts\") pod \"glance-b5fa-account-create-update-hwb48\" (UID: \"67ed00e9-ecdb-408d-8ad6-f4272af25922\") " pod="openstack/glance-b5fa-account-create-update-hwb48"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.882546 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq4km\" (UniqueName: \"kubernetes.io/projected/67ed00e9-ecdb-408d-8ad6-f4272af25922-kube-api-access-hq4km\") pod \"glance-b5fa-account-create-update-hwb48\" (UID: \"67ed00e9-ecdb-408d-8ad6-f4272af25922\") " pod="openstack/glance-b5fa-account-create-update-hwb48"
Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.983524 4771
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ed00e9-ecdb-408d-8ad6-f4272af25922-operator-scripts\") pod \"glance-b5fa-account-create-update-hwb48\" (UID: \"67ed00e9-ecdb-408d-8ad6-f4272af25922\") " pod="openstack/glance-b5fa-account-create-update-hwb48" Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.983572 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq4km\" (UniqueName: \"kubernetes.io/projected/67ed00e9-ecdb-408d-8ad6-f4272af25922-kube-api-access-hq4km\") pod \"glance-b5fa-account-create-update-hwb48\" (UID: \"67ed00e9-ecdb-408d-8ad6-f4272af25922\") " pod="openstack/glance-b5fa-account-create-update-hwb48" Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.983605 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-operator-scripts\") pod \"glance-db-create-hvbsg\" (UID: \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\") " pod="openstack/glance-db-create-hvbsg" Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.984585 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-operator-scripts\") pod \"glance-db-create-hvbsg\" (UID: \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\") " pod="openstack/glance-db-create-hvbsg" Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.984790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28rx9\" (UniqueName: \"kubernetes.io/projected/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-kube-api-access-28rx9\") pod \"glance-db-create-hvbsg\" (UID: \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\") " pod="openstack/glance-db-create-hvbsg" Mar 19 15:35:23 crc kubenswrapper[4771]: I0319 15:35:23.984949 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ed00e9-ecdb-408d-8ad6-f4272af25922-operator-scripts\") pod \"glance-b5fa-account-create-update-hwb48\" (UID: \"67ed00e9-ecdb-408d-8ad6-f4272af25922\") " pod="openstack/glance-b5fa-account-create-update-hwb48" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.004397 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28rx9\" (UniqueName: \"kubernetes.io/projected/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-kube-api-access-28rx9\") pod \"glance-db-create-hvbsg\" (UID: \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\") " pod="openstack/glance-db-create-hvbsg" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.008727 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq4km\" (UniqueName: \"kubernetes.io/projected/67ed00e9-ecdb-408d-8ad6-f4272af25922-kube-api-access-hq4km\") pod \"glance-b5fa-account-create-update-hwb48\" (UID: \"67ed00e9-ecdb-408d-8ad6-f4272af25922\") " pod="openstack/glance-b5fa-account-create-update-hwb48" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.094305 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5fa-account-create-update-hwb48" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.114308 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hvbsg" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.330648 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-xtskz"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.332483 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.345514 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xtskz"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.400893 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lffgq\" (UniqueName: \"kubernetes.io/projected/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-kube-api-access-lffgq\") pod \"keystone-db-create-xtskz\" (UID: \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\") " pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.401003 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-operator-scripts\") pod \"keystone-db-create-xtskz\" (UID: \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\") " pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.444097 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7c74-account-create-update-zx8k5"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.445707 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7c74-account-create-update-zx8k5" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.447963 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.451122 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c74-account-create-update-zx8k5"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.505379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-operator-scripts\") pod \"keystone-db-create-xtskz\" (UID: \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\") " pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.506015 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lffgq\" (UniqueName: \"kubernetes.io/projected/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-kube-api-access-lffgq\") pod \"keystone-db-create-xtskz\" (UID: \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\") " pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.506874 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-operator-scripts\") pod \"keystone-db-create-xtskz\" (UID: \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\") " pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.528138 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lffgq\" (UniqueName: \"kubernetes.io/projected/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-kube-api-access-lffgq\") pod \"keystone-db-create-xtskz\" (UID: \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\") " pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:24 crc 
kubenswrapper[4771]: I0319 15:35:24.557097 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rqggq"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.558163 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rqggq" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.571304 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rqggq"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.604285 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5fa-account-create-update-hwb48"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.607894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-operator-scripts\") pod \"keystone-7c74-account-create-update-zx8k5\" (UID: \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\") " pod="openstack/keystone-7c74-account-create-update-zx8k5" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.607971 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.608038 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4srgn\" (UniqueName: \"kubernetes.io/projected/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-kube-api-access-4srgn\") pod \"keystone-7c74-account-create-update-zx8k5\" (UID: \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\") " pod="openstack/keystone-7c74-account-create-update-zx8k5" Mar 19 15:35:24 crc kubenswrapper[4771]: E0319 15:35:24.610084 4771 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 15:35:24 crc kubenswrapper[4771]: E0319 15:35:24.610111 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 15:35:24 crc kubenswrapper[4771]: E0319 15:35:24.610157 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift podName:67d58e24-649b-4142-a62a-64c9919fe0e4 nodeName:}" failed. No retries permitted until 2026-03-19 15:35:32.610139772 +0000 UTC m=+1191.838761054 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift") pod "swift-storage-0" (UID: "67d58e24-649b-4142-a62a-64c9919fe0e4") : configmap "swift-ring-files" not found Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.646043 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-70d3-account-create-update-pqjbk"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.647122 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-70d3-account-create-update-pqjbk" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.649108 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.659237 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.660605 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-70d3-account-create-update-pqjbk"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.683910 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hvbsg"] Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.709207 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-operator-scripts\") pod \"keystone-7c74-account-create-update-zx8k5\" (UID: \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\") " pod="openstack/keystone-7c74-account-create-update-zx8k5" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.709402 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4srgn\" (UniqueName: \"kubernetes.io/projected/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-kube-api-access-4srgn\") pod \"keystone-7c74-account-create-update-zx8k5\" (UID: \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\") " pod="openstack/keystone-7c74-account-create-update-zx8k5" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.709556 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c93ca3-5940-420d-9ab0-e5a0d2a23964-operator-scripts\") pod \"placement-db-create-rqggq\" (UID: \"67c93ca3-5940-420d-9ab0-e5a0d2a23964\") " pod="openstack/placement-db-create-rqggq" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.709793 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xq5v\" (UniqueName: \"kubernetes.io/projected/67c93ca3-5940-420d-9ab0-e5a0d2a23964-kube-api-access-7xq5v\") pod \"placement-db-create-rqggq\" (UID: 
\"67c93ca3-5940-420d-9ab0-e5a0d2a23964\") " pod="openstack/placement-db-create-rqggq" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.712974 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-operator-scripts\") pod \"keystone-7c74-account-create-update-zx8k5\" (UID: \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\") " pod="openstack/keystone-7c74-account-create-update-zx8k5" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.727593 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4srgn\" (UniqueName: \"kubernetes.io/projected/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-kube-api-access-4srgn\") pod \"keystone-7c74-account-create-update-zx8k5\" (UID: \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\") " pod="openstack/keystone-7c74-account-create-update-zx8k5" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.762324 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7c74-account-create-update-zx8k5" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.763276 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.811527 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c93ca3-5940-420d-9ab0-e5a0d2a23964-operator-scripts\") pod \"placement-db-create-rqggq\" (UID: \"67c93ca3-5940-420d-9ab0-e5a0d2a23964\") " pod="openstack/placement-db-create-rqggq" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.811576 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/896559b5-ae6f-439f-be12-bcd65a28e5ec-operator-scripts\") pod \"placement-70d3-account-create-update-pqjbk\" (UID: \"896559b5-ae6f-439f-be12-bcd65a28e5ec\") " pod="openstack/placement-70d3-account-create-update-pqjbk" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.811600 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d97nv\" (UniqueName: \"kubernetes.io/projected/896559b5-ae6f-439f-be12-bcd65a28e5ec-kube-api-access-d97nv\") pod \"placement-70d3-account-create-update-pqjbk\" (UID: \"896559b5-ae6f-439f-be12-bcd65a28e5ec\") " pod="openstack/placement-70d3-account-create-update-pqjbk" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.811649 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xq5v\" (UniqueName: \"kubernetes.io/projected/67c93ca3-5940-420d-9ab0-e5a0d2a23964-kube-api-access-7xq5v\") pod \"placement-db-create-rqggq\" (UID: \"67c93ca3-5940-420d-9ab0-e5a0d2a23964\") " pod="openstack/placement-db-create-rqggq" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 
15:35:24.812732 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c93ca3-5940-420d-9ab0-e5a0d2a23964-operator-scripts\") pod \"placement-db-create-rqggq\" (UID: \"67c93ca3-5940-420d-9ab0-e5a0d2a23964\") " pod="openstack/placement-db-create-rqggq" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.830288 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xq5v\" (UniqueName: \"kubernetes.io/projected/67c93ca3-5940-420d-9ab0-e5a0d2a23964-kube-api-access-7xq5v\") pod \"placement-db-create-rqggq\" (UID: \"67c93ca3-5940-420d-9ab0-e5a0d2a23964\") " pod="openstack/placement-db-create-rqggq" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.891534 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rqggq" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.959449 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/896559b5-ae6f-439f-be12-bcd65a28e5ec-operator-scripts\") pod \"placement-70d3-account-create-update-pqjbk\" (UID: \"896559b5-ae6f-439f-be12-bcd65a28e5ec\") " pod="openstack/placement-70d3-account-create-update-pqjbk" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.959492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d97nv\" (UniqueName: \"kubernetes.io/projected/896559b5-ae6f-439f-be12-bcd65a28e5ec-kube-api-access-d97nv\") pod \"placement-70d3-account-create-update-pqjbk\" (UID: \"896559b5-ae6f-439f-be12-bcd65a28e5ec\") " pod="openstack/placement-70d3-account-create-update-pqjbk" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.962379 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/896559b5-ae6f-439f-be12-bcd65a28e5ec-operator-scripts\") pod \"placement-70d3-account-create-update-pqjbk\" (UID: \"896559b5-ae6f-439f-be12-bcd65a28e5ec\") " pod="openstack/placement-70d3-account-create-update-pqjbk" Mar 19 15:35:24 crc kubenswrapper[4771]: I0319 15:35:24.982720 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d97nv\" (UniqueName: \"kubernetes.io/projected/896559b5-ae6f-439f-be12-bcd65a28e5ec-kube-api-access-d97nv\") pod \"placement-70d3-account-create-update-pqjbk\" (UID: \"896559b5-ae6f-439f-be12-bcd65a28e5ec\") " pod="openstack/placement-70d3-account-create-update-pqjbk" Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.091817 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hvbsg" event={"ID":"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8","Type":"ContainerStarted","Data":"9c968be97b23a148cb4880311d94d2197a702db8f8dbcba97617562c1b1fca6e"} Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.093027 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5fa-account-create-update-hwb48" event={"ID":"67ed00e9-ecdb-408d-8ad6-f4272af25922","Type":"ContainerStarted","Data":"d27a4c33921e1ea916aabd0a1fae3522cc1c5e62ab274188706034babee21b08"} Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.118009 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xtskz"] Mar 19 15:35:25 crc kubenswrapper[4771]: W0319 15:35:25.135589 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d50a3cd_a539_4f95_b8e1_e157be63cc4d.slice/crio-70f24d0b4843ca130acbcaec4ca49a8de604c4736ebeb020e112abea91eff696 WatchSource:0}: Error finding container 70f24d0b4843ca130acbcaec4ca49a8de604c4736ebeb020e112abea91eff696: Status 404 returned error can't find the container with id 
70f24d0b4843ca130acbcaec4ca49a8de604c4736ebeb020e112abea91eff696 Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.269153 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-70d3-account-create-update-pqjbk" Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.310755 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c74-account-create-update-zx8k5"] Mar 19 15:35:25 crc kubenswrapper[4771]: W0319 15:35:25.350105 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod895ea0c2_2780_4e20_8d2d_ed5c378c6cfe.slice/crio-7a1b0871e2d7044d0dcff96e2917920fa6989fac91ce66b6c11593d6aebb2074 WatchSource:0}: Error finding container 7a1b0871e2d7044d0dcff96e2917920fa6989fac91ce66b6c11593d6aebb2074: Status 404 returned error can't find the container with id 7a1b0871e2d7044d0dcff96e2917920fa6989fac91ce66b6c11593d6aebb2074 Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.420311 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rqggq"] Mar 19 15:35:25 crc kubenswrapper[4771]: W0319 15:35:25.434599 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67c93ca3_5940_420d_9ab0_e5a0d2a23964.slice/crio-14b4f069c100565e0570016892bca99fad2515543cff9dffcad066569da675b7 WatchSource:0}: Error finding container 14b4f069c100565e0570016892bca99fad2515543cff9dffcad066569da675b7: Status 404 returned error can't find the container with id 14b4f069c100565e0570016892bca99fad2515543cff9dffcad066569da675b7 Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.717948 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-70d3-account-create-update-pqjbk"] Mar 19 15:35:25 crc kubenswrapper[4771]: W0319 15:35:25.721958 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod896559b5_ae6f_439f_be12_bcd65a28e5ec.slice/crio-50a72be3b1ba29f26edcae249ad9e0aaa966f16b819678288862a48f0b35c410 WatchSource:0}: Error finding container 50a72be3b1ba29f26edcae249ad9e0aaa966f16b819678288862a48f0b35c410: Status 404 returned error can't find the container with id 50a72be3b1ba29f26edcae249ad9e0aaa966f16b819678288862a48f0b35c410 Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.948131 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.999483 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-429lt"] Mar 19 15:35:25 crc kubenswrapper[4771]: I0319 15:35:25.999713 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" podUID="8ab31eac-215d-4527-80f1-68ef6224e8ed" containerName="dnsmasq-dns" containerID="cri-o://6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2" gracePeriod=10 Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.104316 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xtskz" event={"ID":"9d50a3cd-a539-4f95-b8e1-e157be63cc4d","Type":"ContainerStarted","Data":"a56201bae3bbf956178a4286e88c593e79f87fbd98eb487de2945489ff26e3f8"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.104353 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xtskz" event={"ID":"9d50a3cd-a539-4f95-b8e1-e157be63cc4d","Type":"ContainerStarted","Data":"70f24d0b4843ca130acbcaec4ca49a8de604c4736ebeb020e112abea91eff696"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.108155 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5fa-account-create-update-hwb48" 
event={"ID":"67ed00e9-ecdb-408d-8ad6-f4272af25922","Type":"ContainerStarted","Data":"e1cf2fd7d2344c2ed37b1816e6988dbb86f5d037a89f5e51850bb0881f5a0925"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.109463 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c74-account-create-update-zx8k5" event={"ID":"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe","Type":"ContainerStarted","Data":"39d69706c94058aae53342467ca2ae2a0f6db10d66bda8e28d0593f17c5ed487"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.109508 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c74-account-create-update-zx8k5" event={"ID":"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe","Type":"ContainerStarted","Data":"7a1b0871e2d7044d0dcff96e2917920fa6989fac91ce66b6c11593d6aebb2074"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.112018 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-70d3-account-create-update-pqjbk" event={"ID":"896559b5-ae6f-439f-be12-bcd65a28e5ec","Type":"ContainerStarted","Data":"7c2b2c5f26acff33e5f2bb0f327430855d0d7081af1251d352873b7e0e34f4e9"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.112085 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-70d3-account-create-update-pqjbk" event={"ID":"896559b5-ae6f-439f-be12-bcd65a28e5ec","Type":"ContainerStarted","Data":"50a72be3b1ba29f26edcae249ad9e0aaa966f16b819678288862a48f0b35c410"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.113385 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hvbsg" event={"ID":"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8","Type":"ContainerStarted","Data":"5e67df495d952d972b57081b28e98d502571f5e99ab8ee5afb6fc36ae4b5177f"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.115595 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rqggq" 
event={"ID":"67c93ca3-5940-420d-9ab0-e5a0d2a23964","Type":"ContainerStarted","Data":"ba02326d3d8570c40017025d4d0aa13fbbc2b051c78a6e12a88e9d872e0dc123"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.115620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rqggq" event={"ID":"67c93ca3-5940-420d-9ab0-e5a0d2a23964","Type":"ContainerStarted","Data":"14b4f069c100565e0570016892bca99fad2515543cff9dffcad066569da675b7"} Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.134358 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-xtskz" podStartSLOduration=2.134338281 podStartE2EDuration="2.134338281s" podCreationTimestamp="2026-03-19 15:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:26.12717346 +0000 UTC m=+1185.355794662" watchObservedRunningTime="2026-03-19 15:35:26.134338281 +0000 UTC m=+1185.362959483" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.147581 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-70d3-account-create-update-pqjbk" podStartSLOduration=2.147560208 podStartE2EDuration="2.147560208s" podCreationTimestamp="2026-03-19 15:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:26.146352119 +0000 UTC m=+1185.374973321" watchObservedRunningTime="2026-03-19 15:35:26.147560208 +0000 UTC m=+1185.376181420" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.181366 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b5fa-account-create-update-hwb48" podStartSLOduration=3.181339747 podStartE2EDuration="3.181339747s" podCreationTimestamp="2026-03-19 15:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:26.167073635 +0000 UTC m=+1185.395694837" watchObservedRunningTime="2026-03-19 15:35:26.181339747 +0000 UTC m=+1185.409960959" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.188966 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-hvbsg" podStartSLOduration=3.188942769 podStartE2EDuration="3.188942769s" podCreationTimestamp="2026-03-19 15:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:26.181260335 +0000 UTC m=+1185.409881537" watchObservedRunningTime="2026-03-19 15:35:26.188942769 +0000 UTC m=+1185.417563971" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.208792 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7c74-account-create-update-zx8k5" podStartSLOduration=2.208767524 podStartE2EDuration="2.208767524s" podCreationTimestamp="2026-03-19 15:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:26.199039881 +0000 UTC m=+1185.427661093" watchObservedRunningTime="2026-03-19 15:35:26.208767524 +0000 UTC m=+1185.437388726" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.231360 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-rqggq" podStartSLOduration=2.231332114 podStartE2EDuration="2.231332114s" podCreationTimestamp="2026-03-19 15:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:26.221210672 +0000 UTC m=+1185.449831874" watchObservedRunningTime="2026-03-19 15:35:26.231332114 +0000 UTC m=+1185.459953316" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.524739 4771 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.593619 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-dns-svc\") pod \"8ab31eac-215d-4527-80f1-68ef6224e8ed\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.593782 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-config\") pod \"8ab31eac-215d-4527-80f1-68ef6224e8ed\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.593863 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-sb\") pod \"8ab31eac-215d-4527-80f1-68ef6224e8ed\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.593969 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgtb8\" (UniqueName: \"kubernetes.io/projected/8ab31eac-215d-4527-80f1-68ef6224e8ed-kube-api-access-mgtb8\") pod \"8ab31eac-215d-4527-80f1-68ef6224e8ed\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.594085 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-nb\") pod \"8ab31eac-215d-4527-80f1-68ef6224e8ed\" (UID: \"8ab31eac-215d-4527-80f1-68ef6224e8ed\") " Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.601600 4771 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/8ab31eac-215d-4527-80f1-68ef6224e8ed-kube-api-access-mgtb8" (OuterVolumeSpecName: "kube-api-access-mgtb8") pod "8ab31eac-215d-4527-80f1-68ef6224e8ed" (UID: "8ab31eac-215d-4527-80f1-68ef6224e8ed"). InnerVolumeSpecName "kube-api-access-mgtb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.641150 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-config" (OuterVolumeSpecName: "config") pod "8ab31eac-215d-4527-80f1-68ef6224e8ed" (UID: "8ab31eac-215d-4527-80f1-68ef6224e8ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.648057 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8ab31eac-215d-4527-80f1-68ef6224e8ed" (UID: "8ab31eac-215d-4527-80f1-68ef6224e8ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.655538 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8ab31eac-215d-4527-80f1-68ef6224e8ed" (UID: "8ab31eac-215d-4527-80f1-68ef6224e8ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.686413 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8ab31eac-215d-4527-80f1-68ef6224e8ed" (UID: "8ab31eac-215d-4527-80f1-68ef6224e8ed"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.695562 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.695589 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgtb8\" (UniqueName: \"kubernetes.io/projected/8ab31eac-215d-4527-80f1-68ef6224e8ed-kube-api-access-mgtb8\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.695600 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.695609 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:26 crc kubenswrapper[4771]: I0319 15:35:26.695618 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab31eac-215d-4527-80f1-68ef6224e8ed-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.126697 4771 generic.go:334] "Generic (PLEG): container finished" podID="67ed00e9-ecdb-408d-8ad6-f4272af25922" containerID="e1cf2fd7d2344c2ed37b1816e6988dbb86f5d037a89f5e51850bb0881f5a0925" exitCode=0 Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.126821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5fa-account-create-update-hwb48" event={"ID":"67ed00e9-ecdb-408d-8ad6-f4272af25922","Type":"ContainerDied","Data":"e1cf2fd7d2344c2ed37b1816e6988dbb86f5d037a89f5e51850bb0881f5a0925"} 
Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.128839 4771 generic.go:334] "Generic (PLEG): container finished" podID="895ea0c2-2780-4e20-8d2d-ed5c378c6cfe" containerID="39d69706c94058aae53342467ca2ae2a0f6db10d66bda8e28d0593f17c5ed487" exitCode=0 Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.128925 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c74-account-create-update-zx8k5" event={"ID":"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe","Type":"ContainerDied","Data":"39d69706c94058aae53342467ca2ae2a0f6db10d66bda8e28d0593f17c5ed487"} Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.131127 4771 generic.go:334] "Generic (PLEG): container finished" podID="896559b5-ae6f-439f-be12-bcd65a28e5ec" containerID="7c2b2c5f26acff33e5f2bb0f327430855d0d7081af1251d352873b7e0e34f4e9" exitCode=0 Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.131211 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-70d3-account-create-update-pqjbk" event={"ID":"896559b5-ae6f-439f-be12-bcd65a28e5ec","Type":"ContainerDied","Data":"7c2b2c5f26acff33e5f2bb0f327430855d0d7081af1251d352873b7e0e34f4e9"} Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.133380 4771 generic.go:334] "Generic (PLEG): container finished" podID="8ab31eac-215d-4527-80f1-68ef6224e8ed" containerID="6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2" exitCode=0 Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.133657 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" event={"ID":"8ab31eac-215d-4527-80f1-68ef6224e8ed","Type":"ContainerDied","Data":"6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2"} Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.133711 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" 
event={"ID":"8ab31eac-215d-4527-80f1-68ef6224e8ed","Type":"ContainerDied","Data":"0296ce5145112712e2ab33811536c4b587381f31b88a5b7f6f43723d50e645d4"} Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.133741 4771 scope.go:117] "RemoveContainer" containerID="6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2" Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.133761 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-429lt" Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.136115 4771 generic.go:334] "Generic (PLEG): container finished" podID="1c0318a8-d871-42c6-aaa9-4c6f07bb90a8" containerID="5e67df495d952d972b57081b28e98d502571f5e99ab8ee5afb6fc36ae4b5177f" exitCode=0 Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.136161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hvbsg" event={"ID":"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8","Type":"ContainerDied","Data":"5e67df495d952d972b57081b28e98d502571f5e99ab8ee5afb6fc36ae4b5177f"} Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.141375 4771 generic.go:334] "Generic (PLEG): container finished" podID="67c93ca3-5940-420d-9ab0-e5a0d2a23964" containerID="ba02326d3d8570c40017025d4d0aa13fbbc2b051c78a6e12a88e9d872e0dc123" exitCode=0 Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.141458 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rqggq" event={"ID":"67c93ca3-5940-420d-9ab0-e5a0d2a23964","Type":"ContainerDied","Data":"ba02326d3d8570c40017025d4d0aa13fbbc2b051c78a6e12a88e9d872e0dc123"} Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.143515 4771 generic.go:334] "Generic (PLEG): container finished" podID="9d50a3cd-a539-4f95-b8e1-e157be63cc4d" containerID="a56201bae3bbf956178a4286e88c593e79f87fbd98eb487de2945489ff26e3f8" exitCode=0 Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.143560 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-db-create-xtskz" event={"ID":"9d50a3cd-a539-4f95-b8e1-e157be63cc4d","Type":"ContainerDied","Data":"a56201bae3bbf956178a4286e88c593e79f87fbd98eb487de2945489ff26e3f8"} Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.161366 4771 scope.go:117] "RemoveContainer" containerID="e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add" Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.192279 4771 scope.go:117] "RemoveContainer" containerID="6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2" Mar 19 15:35:27 crc kubenswrapper[4771]: E0319 15:35:27.192891 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2\": container with ID starting with 6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2 not found: ID does not exist" containerID="6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2" Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.192954 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2"} err="failed to get container status \"6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2\": rpc error: code = NotFound desc = could not find container \"6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2\": container with ID starting with 6a719c8b560f195535b728f0b5999dbb13e68daf4c4500c9840fe96e294fa2d2 not found: ID does not exist" Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.192980 4771 scope.go:117] "RemoveContainer" containerID="e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add" Mar 19 15:35:27 crc kubenswrapper[4771]: E0319 15:35:27.193457 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add\": container with ID starting with e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add not found: ID does not exist" containerID="e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add" Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.193504 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add"} err="failed to get container status \"e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add\": rpc error: code = NotFound desc = could not find container \"e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add\": container with ID starting with e0a154669c3c438ad194dfd8322f38a7d6df72af139a76e388a29dd9d2994add not found: ID does not exist" Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.270220 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-429lt"] Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.276218 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-429lt"] Mar 19 15:35:27 crc kubenswrapper[4771]: I0319 15:35:27.525135 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab31eac-215d-4527-80f1-68ef6224e8ed" path="/var/lib/kubelet/pods/8ab31eac-215d-4527-80f1-68ef6224e8ed/volumes" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.625328 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b5fa-account-create-update-hwb48" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.663646 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq4km\" (UniqueName: \"kubernetes.io/projected/67ed00e9-ecdb-408d-8ad6-f4272af25922-kube-api-access-hq4km\") pod \"67ed00e9-ecdb-408d-8ad6-f4272af25922\" (UID: \"67ed00e9-ecdb-408d-8ad6-f4272af25922\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.663991 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ed00e9-ecdb-408d-8ad6-f4272af25922-operator-scripts\") pod \"67ed00e9-ecdb-408d-8ad6-f4272af25922\" (UID: \"67ed00e9-ecdb-408d-8ad6-f4272af25922\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.665345 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ed00e9-ecdb-408d-8ad6-f4272af25922-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67ed00e9-ecdb-408d-8ad6-f4272af25922" (UID: "67ed00e9-ecdb-408d-8ad6-f4272af25922"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.672265 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ed00e9-ecdb-408d-8ad6-f4272af25922-kube-api-access-hq4km" (OuterVolumeSpecName: "kube-api-access-hq4km") pod "67ed00e9-ecdb-408d-8ad6-f4272af25922" (UID: "67ed00e9-ecdb-408d-8ad6-f4272af25922"). InnerVolumeSpecName "kube-api-access-hq4km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.765646 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq4km\" (UniqueName: \"kubernetes.io/projected/67ed00e9-ecdb-408d-8ad6-f4272af25922-kube-api-access-hq4km\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.765684 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67ed00e9-ecdb-408d-8ad6-f4272af25922-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.783221 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-70d3-account-create-update-pqjbk" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.794976 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.803497 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rqggq" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.821938 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hvbsg" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.840446 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7c74-account-create-update-zx8k5" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.868696 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c93ca3-5940-420d-9ab0-e5a0d2a23964-operator-scripts\") pod \"67c93ca3-5940-420d-9ab0-e5a0d2a23964\" (UID: \"67c93ca3-5940-420d-9ab0-e5a0d2a23964\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.868758 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xq5v\" (UniqueName: \"kubernetes.io/projected/67c93ca3-5940-420d-9ab0-e5a0d2a23964-kube-api-access-7xq5v\") pod \"67c93ca3-5940-420d-9ab0-e5a0d2a23964\" (UID: \"67c93ca3-5940-420d-9ab0-e5a0d2a23964\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.868829 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-operator-scripts\") pod \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\" (UID: \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.868875 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28rx9\" (UniqueName: \"kubernetes.io/projected/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-kube-api-access-28rx9\") pod \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\" (UID: \"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.868914 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d97nv\" (UniqueName: \"kubernetes.io/projected/896559b5-ae6f-439f-be12-bcd65a28e5ec-kube-api-access-d97nv\") pod \"896559b5-ae6f-439f-be12-bcd65a28e5ec\" (UID: \"896559b5-ae6f-439f-be12-bcd65a28e5ec\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.868969 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lffgq\" (UniqueName: \"kubernetes.io/projected/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-kube-api-access-lffgq\") pod \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\" (UID: \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.869045 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-operator-scripts\") pod \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\" (UID: \"9d50a3cd-a539-4f95-b8e1-e157be63cc4d\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.869115 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/896559b5-ae6f-439f-be12-bcd65a28e5ec-operator-scripts\") pod \"896559b5-ae6f-439f-be12-bcd65a28e5ec\" (UID: \"896559b5-ae6f-439f-be12-bcd65a28e5ec\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.870173 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/896559b5-ae6f-439f-be12-bcd65a28e5ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "896559b5-ae6f-439f-be12-bcd65a28e5ec" (UID: "896559b5-ae6f-439f-be12-bcd65a28e5ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.870585 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c93ca3-5940-420d-9ab0-e5a0d2a23964-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67c93ca3-5940-420d-9ab0-e5a0d2a23964" (UID: "67c93ca3-5940-420d-9ab0-e5a0d2a23964"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.873905 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c93ca3-5940-420d-9ab0-e5a0d2a23964-kube-api-access-7xq5v" (OuterVolumeSpecName: "kube-api-access-7xq5v") pod "67c93ca3-5940-420d-9ab0-e5a0d2a23964" (UID: "67c93ca3-5940-420d-9ab0-e5a0d2a23964"). InnerVolumeSpecName "kube-api-access-7xq5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.875257 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c0318a8-d871-42c6-aaa9-4c6f07bb90a8" (UID: "1c0318a8-d871-42c6-aaa9-4c6f07bb90a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.875684 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d50a3cd-a539-4f95-b8e1-e157be63cc4d" (UID: "9d50a3cd-a539-4f95-b8e1-e157be63cc4d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.877018 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-kube-api-access-lffgq" (OuterVolumeSpecName: "kube-api-access-lffgq") pod "9d50a3cd-a539-4f95-b8e1-e157be63cc4d" (UID: "9d50a3cd-a539-4f95-b8e1-e157be63cc4d"). InnerVolumeSpecName "kube-api-access-lffgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.877903 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-kube-api-access-28rx9" (OuterVolumeSpecName: "kube-api-access-28rx9") pod "1c0318a8-d871-42c6-aaa9-4c6f07bb90a8" (UID: "1c0318a8-d871-42c6-aaa9-4c6f07bb90a8"). InnerVolumeSpecName "kube-api-access-28rx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.879260 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/896559b5-ae6f-439f-be12-bcd65a28e5ec-kube-api-access-d97nv" (OuterVolumeSpecName: "kube-api-access-d97nv") pod "896559b5-ae6f-439f-be12-bcd65a28e5ec" (UID: "896559b5-ae6f-439f-be12-bcd65a28e5ec"). InnerVolumeSpecName "kube-api-access-d97nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.974774 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4srgn\" (UniqueName: \"kubernetes.io/projected/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-kube-api-access-4srgn\") pod \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\" (UID: \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.974928 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-operator-scripts\") pod \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\" (UID: \"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe\") " Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.975531 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc 
kubenswrapper[4771]: I0319 15:35:28.975549 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28rx9\" (UniqueName: \"kubernetes.io/projected/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8-kube-api-access-28rx9\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.975564 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d97nv\" (UniqueName: \"kubernetes.io/projected/896559b5-ae6f-439f-be12-bcd65a28e5ec-kube-api-access-d97nv\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.975575 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lffgq\" (UniqueName: \"kubernetes.io/projected/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-kube-api-access-lffgq\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.975586 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d50a3cd-a539-4f95-b8e1-e157be63cc4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.975597 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/896559b5-ae6f-439f-be12-bcd65a28e5ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.975608 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c93ca3-5940-420d-9ab0-e5a0d2a23964-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.975618 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xq5v\" (UniqueName: \"kubernetes.io/projected/67c93ca3-5940-420d-9ab0-e5a0d2a23964-kube-api-access-7xq5v\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.976097 4771 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "895ea0c2-2780-4e20-8d2d-ed5c378c6cfe" (UID: "895ea0c2-2780-4e20-8d2d-ed5c378c6cfe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:35:28 crc kubenswrapper[4771]: I0319 15:35:28.979251 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-kube-api-access-4srgn" (OuterVolumeSpecName: "kube-api-access-4srgn") pod "895ea0c2-2780-4e20-8d2d-ed5c378c6cfe" (UID: "895ea0c2-2780-4e20-8d2d-ed5c378c6cfe"). InnerVolumeSpecName "kube-api-access-4srgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.078058 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4srgn\" (UniqueName: \"kubernetes.io/projected/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-kube-api-access-4srgn\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.078119 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.167442 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rqggq" event={"ID":"67c93ca3-5940-420d-9ab0-e5a0d2a23964","Type":"ContainerDied","Data":"14b4f069c100565e0570016892bca99fad2515543cff9dffcad066569da675b7"} Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.167486 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b4f069c100565e0570016892bca99fad2515543cff9dffcad066569da675b7" Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.167519 4771 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rqggq" Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.170130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xtskz" event={"ID":"9d50a3cd-a539-4f95-b8e1-e157be63cc4d","Type":"ContainerDied","Data":"70f24d0b4843ca130acbcaec4ca49a8de604c4736ebeb020e112abea91eff696"} Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.170168 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f24d0b4843ca130acbcaec4ca49a8de604c4736ebeb020e112abea91eff696" Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.170224 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xtskz" Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.171746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5fa-account-create-update-hwb48" event={"ID":"67ed00e9-ecdb-408d-8ad6-f4272af25922","Type":"ContainerDied","Data":"d27a4c33921e1ea916aabd0a1fae3522cc1c5e62ab274188706034babee21b08"} Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.171771 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d27a4c33921e1ea916aabd0a1fae3522cc1c5e62ab274188706034babee21b08" Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.171824 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5fa-account-create-update-hwb48" Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.173982 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7c74-account-create-update-zx8k5"
Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.173992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c74-account-create-update-zx8k5" event={"ID":"895ea0c2-2780-4e20-8d2d-ed5c378c6cfe","Type":"ContainerDied","Data":"7a1b0871e2d7044d0dcff96e2917920fa6989fac91ce66b6c11593d6aebb2074"}
Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.174076 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a1b0871e2d7044d0dcff96e2917920fa6989fac91ce66b6c11593d6aebb2074"
Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.177821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-70d3-account-create-update-pqjbk" event={"ID":"896559b5-ae6f-439f-be12-bcd65a28e5ec","Type":"ContainerDied","Data":"50a72be3b1ba29f26edcae249ad9e0aaa966f16b819678288862a48f0b35c410"}
Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.177858 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-70d3-account-create-update-pqjbk"
Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.177877 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50a72be3b1ba29f26edcae249ad9e0aaa966f16b819678288862a48f0b35c410"
Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.180794 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hvbsg" event={"ID":"1c0318a8-d871-42c6-aaa9-4c6f07bb90a8","Type":"ContainerDied","Data":"9c968be97b23a148cb4880311d94d2197a702db8f8dbcba97617562c1b1fca6e"}
Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.180847 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hvbsg"
Mar 19 15:35:29 crc kubenswrapper[4771]: I0319 15:35:29.180863 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c968be97b23a148cb4880311d94d2197a702db8f8dbcba97617562c1b1fca6e"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344086 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tdqvp"]
Mar 19 15:35:30 crc kubenswrapper[4771]: E0319 15:35:30.344610 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab31eac-215d-4527-80f1-68ef6224e8ed" containerName="init"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344622 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab31eac-215d-4527-80f1-68ef6224e8ed" containerName="init"
Mar 19 15:35:30 crc kubenswrapper[4771]: E0319 15:35:30.344635 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896559b5-ae6f-439f-be12-bcd65a28e5ec" containerName="mariadb-account-create-update"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344641 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="896559b5-ae6f-439f-be12-bcd65a28e5ec" containerName="mariadb-account-create-update"
Mar 19 15:35:30 crc kubenswrapper[4771]: E0319 15:35:30.344659 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab31eac-215d-4527-80f1-68ef6224e8ed" containerName="dnsmasq-dns"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344664 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab31eac-215d-4527-80f1-68ef6224e8ed" containerName="dnsmasq-dns"
Mar 19 15:35:30 crc kubenswrapper[4771]: E0319 15:35:30.344674 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d50a3cd-a539-4f95-b8e1-e157be63cc4d" containerName="mariadb-database-create"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344680 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d50a3cd-a539-4f95-b8e1-e157be63cc4d" containerName="mariadb-database-create"
Mar 19 15:35:30 crc kubenswrapper[4771]: E0319 15:35:30.344690 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0318a8-d871-42c6-aaa9-4c6f07bb90a8" containerName="mariadb-database-create"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344695 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0318a8-d871-42c6-aaa9-4c6f07bb90a8" containerName="mariadb-database-create"
Mar 19 15:35:30 crc kubenswrapper[4771]: E0319 15:35:30.344707 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ed00e9-ecdb-408d-8ad6-f4272af25922" containerName="mariadb-account-create-update"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344712 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ed00e9-ecdb-408d-8ad6-f4272af25922" containerName="mariadb-account-create-update"
Mar 19 15:35:30 crc kubenswrapper[4771]: E0319 15:35:30.344724 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895ea0c2-2780-4e20-8d2d-ed5c378c6cfe" containerName="mariadb-account-create-update"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344729 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="895ea0c2-2780-4e20-8d2d-ed5c378c6cfe" containerName="mariadb-account-create-update"
Mar 19 15:35:30 crc kubenswrapper[4771]: E0319 15:35:30.344740 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c93ca3-5940-420d-9ab0-e5a0d2a23964" containerName="mariadb-database-create"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344745 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c93ca3-5940-420d-9ab0-e5a0d2a23964" containerName="mariadb-database-create"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344875 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ed00e9-ecdb-408d-8ad6-f4272af25922" containerName="mariadb-account-create-update"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344886 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="895ea0c2-2780-4e20-8d2d-ed5c378c6cfe" containerName="mariadb-account-create-update"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344899 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c93ca3-5940-420d-9ab0-e5a0d2a23964" containerName="mariadb-database-create"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344906 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab31eac-215d-4527-80f1-68ef6224e8ed" containerName="dnsmasq-dns"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344918 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d50a3cd-a539-4f95-b8e1-e157be63cc4d" containerName="mariadb-database-create"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344926 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0318a8-d871-42c6-aaa9-4c6f07bb90a8" containerName="mariadb-database-create"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.344934 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="896559b5-ae6f-439f-be12-bcd65a28e5ec" containerName="mariadb-account-create-update"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.345402 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.348408 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.354369 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tdqvp"]
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.400176 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llp2c\" (UniqueName: \"kubernetes.io/projected/f500993b-fcda-483e-a93c-b56f12ded0c0-kube-api-access-llp2c\") pod \"root-account-create-update-tdqvp\" (UID: \"f500993b-fcda-483e-a93c-b56f12ded0c0\") " pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.400558 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f500993b-fcda-483e-a93c-b56f12ded0c0-operator-scripts\") pod \"root-account-create-update-tdqvp\" (UID: \"f500993b-fcda-483e-a93c-b56f12ded0c0\") " pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.502377 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f500993b-fcda-483e-a93c-b56f12ded0c0-operator-scripts\") pod \"root-account-create-update-tdqvp\" (UID: \"f500993b-fcda-483e-a93c-b56f12ded0c0\") " pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.502800 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llp2c\" (UniqueName: \"kubernetes.io/projected/f500993b-fcda-483e-a93c-b56f12ded0c0-kube-api-access-llp2c\") pod \"root-account-create-update-tdqvp\" (UID: \"f500993b-fcda-483e-a93c-b56f12ded0c0\") " pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.503436 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f500993b-fcda-483e-a93c-b56f12ded0c0-operator-scripts\") pod \"root-account-create-update-tdqvp\" (UID: \"f500993b-fcda-483e-a93c-b56f12ded0c0\") " pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.521884 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llp2c\" (UniqueName: \"kubernetes.io/projected/f500993b-fcda-483e-a93c-b56f12ded0c0-kube-api-access-llp2c\") pod \"root-account-create-update-tdqvp\" (UID: \"f500993b-fcda-483e-a93c-b56f12ded0c0\") " pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:30 crc kubenswrapper[4771]: I0319 15:35:30.667592 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:31 crc kubenswrapper[4771]: I0319 15:35:31.084824 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tdqvp"]
Mar 19 15:35:31 crc kubenswrapper[4771]: I0319 15:35:31.196378 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tdqvp" event={"ID":"f500993b-fcda-483e-a93c-b56f12ded0c0","Type":"ContainerStarted","Data":"2f1b00da9e872cfa25b82d8e6b7dbc1c6dba4db4f63bedbe7a1a3eabf2226f58"}
Mar 19 15:35:31 crc kubenswrapper[4771]: I0319 15:35:31.198282 4771 generic.go:334] "Generic (PLEG): container finished" podID="fde481a0-3182-4ad8-90ff-7fc8da0ecde2" containerID="8148eb063ea59c6f89b97ff5a1c5e773b8b9477028f05e22f75ed6ee94846bf3" exitCode=0
Mar 19 15:35:31 crc kubenswrapper[4771]: I0319 15:35:31.198332 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7tbkt" event={"ID":"fde481a0-3182-4ad8-90ff-7fc8da0ecde2","Type":"ContainerDied","Data":"8148eb063ea59c6f89b97ff5a1c5e773b8b9477028f05e22f75ed6ee94846bf3"}
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.210292 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="8d82de8c0a9a55c60c139dfff637c54b671c7709788c1b5b12ec26d65e83f90e" exitCode=0
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.210360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"8d82de8c0a9a55c60c139dfff637c54b671c7709788c1b5b12ec26d65e83f90e"}
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.214543 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="3846aac8cb06f5b9e133188e72469bd0f51a9cfb84c06e142e5eaccd41e6326d" exitCode=0
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.214709 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"3846aac8cb06f5b9e133188e72469bd0f51a9cfb84c06e142e5eaccd41e6326d"}
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.219555 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tdqvp" event={"ID":"f500993b-fcda-483e-a93c-b56f12ded0c0","Type":"ContainerStarted","Data":"dfee18827bb3d9a90d2cbe70a5ad81a401b0cfbc305a2adacf8eccca2a8c2acf"}
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.272407 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-tdqvp" podStartSLOduration=2.272393577 podStartE2EDuration="2.272393577s" podCreationTimestamp="2026-03-19 15:35:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:32.271170408 +0000 UTC m=+1191.499791610" watchObservedRunningTime="2026-03-19 15:35:32.272393577 +0000 UTC m=+1191.501014769"
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.500163 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.550648 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-ring-data-devices\") pod \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") "
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.550708 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-etc-swift\") pod \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") "
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.550758 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5kpb\" (UniqueName: \"kubernetes.io/projected/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-kube-api-access-k5kpb\") pod \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") "
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.550798 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-scripts\") pod \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") "
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.550916 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-combined-ca-bundle\") pod \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") "
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.550981 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-dispersionconf\") pod \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") "
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.551037 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-swiftconf\") pod \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\" (UID: \"fde481a0-3182-4ad8-90ff-7fc8da0ecde2\") "
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.551412 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fde481a0-3182-4ad8-90ff-7fc8da0ecde2" (UID: "fde481a0-3182-4ad8-90ff-7fc8da0ecde2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.551690 4771 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.552284 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fde481a0-3182-4ad8-90ff-7fc8da0ecde2" (UID: "fde481a0-3182-4ad8-90ff-7fc8da0ecde2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.559209 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-kube-api-access-k5kpb" (OuterVolumeSpecName: "kube-api-access-k5kpb") pod "fde481a0-3182-4ad8-90ff-7fc8da0ecde2" (UID: "fde481a0-3182-4ad8-90ff-7fc8da0ecde2"). InnerVolumeSpecName "kube-api-access-k5kpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.562646 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fde481a0-3182-4ad8-90ff-7fc8da0ecde2" (UID: "fde481a0-3182-4ad8-90ff-7fc8da0ecde2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.570219 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-scripts" (OuterVolumeSpecName: "scripts") pod "fde481a0-3182-4ad8-90ff-7fc8da0ecde2" (UID: "fde481a0-3182-4ad8-90ff-7fc8da0ecde2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.579291 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fde481a0-3182-4ad8-90ff-7fc8da0ecde2" (UID: "fde481a0-3182-4ad8-90ff-7fc8da0ecde2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.584927 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fde481a0-3182-4ad8-90ff-7fc8da0ecde2" (UID: "fde481a0-3182-4ad8-90ff-7fc8da0ecde2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.652702 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.652839 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5kpb\" (UniqueName: \"kubernetes.io/projected/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-kube-api-access-k5kpb\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.652851 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.652862 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.652871 4771 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.652879 4771 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.652886 4771 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fde481a0-3182-4ad8-90ff-7fc8da0ecde2-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.656612 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67d58e24-649b-4142-a62a-64c9919fe0e4-etc-swift\") pod \"swift-storage-0\" (UID: \"67d58e24-649b-4142-a62a-64c9919fe0e4\") " pod="openstack/swift-storage-0"
Mar 19 15:35:32 crc kubenswrapper[4771]: I0319 15:35:32.939106 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.228571 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"d9f43f1e7de39a493f7959b4f45870b1b227f88706b71e937eaab9ff6aaa0c04"}
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.229126 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.239423 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7tbkt"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.239828 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7tbkt" event={"ID":"fde481a0-3182-4ad8-90ff-7fc8da0ecde2","Type":"ContainerDied","Data":"7964ba80782b5c5d3f48e53d5b86497f3530b9e47e3538f1b0f84ad8fcea4bd5"}
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.239871 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7964ba80782b5c5d3f48e53d5b86497f3530b9e47e3538f1b0f84ad8fcea4bd5"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.247918 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"46cae5d55136c70e0709cacbf082bf077e68ce5a3bc258945276fab541b2cf08"}
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.248911 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.250866 4771 generic.go:334] "Generic (PLEG): container finished" podID="f500993b-fcda-483e-a93c-b56f12ded0c0" containerID="dfee18827bb3d9a90d2cbe70a5ad81a401b0cfbc305a2adacf8eccca2a8c2acf" exitCode=0
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.250909 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tdqvp" event={"ID":"f500993b-fcda-483e-a93c-b56f12ded0c0","Type":"ContainerDied","Data":"dfee18827bb3d9a90d2cbe70a5ad81a401b0cfbc305a2adacf8eccca2a8c2acf"}
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.268117 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.788444529 podStartE2EDuration="55.268091677s" podCreationTimestamp="2026-03-19 15:34:38 +0000 UTC" firstStartedPulling="2026-03-19 15:34:40.4053178 +0000 UTC m=+1139.633939002" lastFinishedPulling="2026-03-19 15:34:56.884964938 +0000 UTC m=+1156.113586150" observedRunningTime="2026-03-19 15:35:33.262241707 +0000 UTC m=+1192.490862909" watchObservedRunningTime="2026-03-19 15:35:33.268091677 +0000 UTC m=+1192.496712889"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.298245 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.404359143 podStartE2EDuration="55.298215198s" podCreationTimestamp="2026-03-19 15:34:38 +0000 UTC" firstStartedPulling="2026-03-19 15:34:45.06944879 +0000 UTC m=+1144.298070002" lastFinishedPulling="2026-03-19 15:34:56.963304855 +0000 UTC m=+1156.191926057" observedRunningTime="2026-03-19 15:35:33.297164824 +0000 UTC m=+1192.525786026" watchObservedRunningTime="2026-03-19 15:35:33.298215198 +0000 UTC m=+1192.526836400"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.474378 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 19 15:35:33 crc kubenswrapper[4771]: W0319 15:35:33.475717 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67d58e24_649b_4142_a62a_64c9919fe0e4.slice/crio-9db33406f599495b6a6db58f5fc3773c0bd71f8495afb7ab46ba4ae0a5cfd963 WatchSource:0}: Error finding container 9db33406f599495b6a6db58f5fc3773c0bd71f8495afb7ab46ba4ae0a5cfd963: Status 404 returned error can't find the container with id 9db33406f599495b6a6db58f5fc3773c0bd71f8495afb7ab46ba4ae0a5cfd963
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.959378 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-mwcmg"]
Mar 19 15:35:33 crc kubenswrapper[4771]: E0319 15:35:33.959938 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde481a0-3182-4ad8-90ff-7fc8da0ecde2" containerName="swift-ring-rebalance"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.959950 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde481a0-3182-4ad8-90ff-7fc8da0ecde2" containerName="swift-ring-rebalance"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.960166 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde481a0-3182-4ad8-90ff-7fc8da0ecde2" containerName="swift-ring-rebalance"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.960670 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.966394 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.966583 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xkksh"
Mar 19 15:35:33 crc kubenswrapper[4771]: I0319 15:35:33.981141 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mwcmg"]
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.080771 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-combined-ca-bundle\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.080831 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-config-data\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.081358 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgjzr\" (UniqueName: \"kubernetes.io/projected/04404b99-3d2c-42a9-8137-f9f7991d2d06-kube-api-access-dgjzr\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.081646 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-db-sync-config-data\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.183216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-combined-ca-bundle\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.183316 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-config-data\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.183457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjzr\" (UniqueName: \"kubernetes.io/projected/04404b99-3d2c-42a9-8137-f9f7991d2d06-kube-api-access-dgjzr\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.183526 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-db-sync-config-data\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.191242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-combined-ca-bundle\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.198718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-config-data\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.200409 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-db-sync-config-data\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.202568 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgjzr\" (UniqueName: \"kubernetes.io/projected/04404b99-3d2c-42a9-8137-f9f7991d2d06-kube-api-access-dgjzr\") pod \"glance-db-sync-mwcmg\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") " pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.258856 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"9db33406f599495b6a6db58f5fc3773c0bd71f8495afb7ab46ba4ae0a5cfd963"}
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.280920 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.742134 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.813122 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f500993b-fcda-483e-a93c-b56f12ded0c0-operator-scripts\") pod \"f500993b-fcda-483e-a93c-b56f12ded0c0\" (UID: \"f500993b-fcda-483e-a93c-b56f12ded0c0\") "
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.813540 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llp2c\" (UniqueName: \"kubernetes.io/projected/f500993b-fcda-483e-a93c-b56f12ded0c0-kube-api-access-llp2c\") pod \"f500993b-fcda-483e-a93c-b56f12ded0c0\" (UID: \"f500993b-fcda-483e-a93c-b56f12ded0c0\") "
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.813906 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f500993b-fcda-483e-a93c-b56f12ded0c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f500993b-fcda-483e-a93c-b56f12ded0c0" (UID: "f500993b-fcda-483e-a93c-b56f12ded0c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.814416 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f500993b-fcda-483e-a93c-b56f12ded0c0-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.820552 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f500993b-fcda-483e-a93c-b56f12ded0c0-kube-api-access-llp2c" (OuterVolumeSpecName: "kube-api-access-llp2c") pod "f500993b-fcda-483e-a93c-b56f12ded0c0" (UID: "f500993b-fcda-483e-a93c-b56f12ded0c0"). InnerVolumeSpecName "kube-api-access-llp2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.915978 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llp2c\" (UniqueName: \"kubernetes.io/projected/f500993b-fcda-483e-a93c-b56f12ded0c0-kube-api-access-llp2c\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.929671 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mwcmg"]
Mar 19 15:35:34 crc kubenswrapper[4771]: I0319 15:35:34.939322 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 19 15:35:35 crc kubenswrapper[4771]: W0319 15:35:35.175319 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04404b99_3d2c_42a9_8137_f9f7991d2d06.slice/crio-46d2f0639f332086eff706438ac5cdfd488d36e9d26219e69db8f95d90d40814 WatchSource:0}: Error finding container 46d2f0639f332086eff706438ac5cdfd488d36e9d26219e69db8f95d90d40814: Status 404 returned error can't find the container with id 46d2f0639f332086eff706438ac5cdfd488d36e9d26219e69db8f95d90d40814
Mar 19 15:35:35 crc kubenswrapper[4771]: I0319 15:35:35.270133 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tdqvp"
Mar 19 15:35:35 crc kubenswrapper[4771]: I0319 15:35:35.270130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tdqvp" event={"ID":"f500993b-fcda-483e-a93c-b56f12ded0c0","Type":"ContainerDied","Data":"2f1b00da9e872cfa25b82d8e6b7dbc1c6dba4db4f63bedbe7a1a3eabf2226f58"}
Mar 19 15:35:35 crc kubenswrapper[4771]: I0319 15:35:35.270267 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1b00da9e872cfa25b82d8e6b7dbc1c6dba4db4f63bedbe7a1a3eabf2226f58"
Mar 19 15:35:35 crc kubenswrapper[4771]: I0319 15:35:35.271538 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mwcmg" event={"ID":"04404b99-3d2c-42a9-8137-f9f7991d2d06","Type":"ContainerStarted","Data":"46d2f0639f332086eff706438ac5cdfd488d36e9d26219e69db8f95d90d40814"}
Mar 19 15:35:36 crc kubenswrapper[4771]: I0319 15:35:36.280354 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"bf7fcafbe551f99941560878ad4d5df1561c64db1ac7e05e77d3810d035b9d75"}
Mar 19 15:35:36 crc kubenswrapper[4771]: I0319 15:35:36.280976 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"bb573b0d1cbce392e58e2d93f4198bfb15e1f163c43f8ab4589c8eca3279c6ca"}
Mar 19 15:35:36 crc kubenswrapper[4771]: I0319 15:35:36.281005 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"4be3369e645b195906974e36e71684d8ed795e1ae8b7e61c14e631b8c0472a89"}
Mar 19 15:35:36 crc kubenswrapper[4771]: I0319 15:35:36.281015 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"d07cbe6ee5292e0f233584f758512e943a0de9fdc3e7b38be38c4a878c072e75"}
Mar 19 15:35:36 crc kubenswrapper[4771]: I0319 15:35:36.647620 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tdqvp"]
Mar 19 15:35:36 crc kubenswrapper[4771]: I0319 15:35:36.656292 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-tdqvp"]
Mar 19 15:35:37 crc kubenswrapper[4771]: I0319 15:35:37.539508 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f500993b-fcda-483e-a93c-b56f12ded0c0" path="/var/lib/kubelet/pods/f500993b-fcda-483e-a93c-b56f12ded0c0/volumes"
Mar 19 15:35:38 crc kubenswrapper[4771]: I0319 15:35:38.296061 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="46cae5d55136c70e0709cacbf082bf077e68ce5a3bc258945276fab541b2cf08" exitCode=0
Mar 19 15:35:38 crc kubenswrapper[4771]: I0319 15:35:38.296098 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"46cae5d55136c70e0709cacbf082bf077e68ce5a3bc258945276fab541b2cf08"}
Mar 19 15:35:38 crc kubenswrapper[4771]: I0319 15:35:38.297290 4771 scope.go:117] "RemoveContainer" containerID="46cae5d55136c70e0709cacbf082bf077e68ce5a3bc258945276fab541b2cf08"
Mar 19 15:35:38 crc kubenswrapper[4771]: I0319 15:35:38.305242 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="d9f43f1e7de39a493f7959b4f45870b1b227f88706b71e937eaab9ff6aaa0c04" exitCode=0
Mar 19 15:35:38 crc kubenswrapper[4771]: I0319 15:35:38.305302 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"d9f43f1e7de39a493f7959b4f45870b1b227f88706b71e937eaab9ff6aaa0c04"}
Mar 19 15:35:38 crc kubenswrapper[4771]: I0319 15:35:38.306078 4771 scope.go:117] "RemoveContainer" containerID="d9f43f1e7de39a493f7959b4f45870b1b227f88706b71e937eaab9ff6aaa0c04"
Mar 19 15:35:38 crc kubenswrapper[4771]: I0319 15:35:38.784498 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w5jsx" podUID="e8c6ed19-f258-4da0-966a-6c538b85dce1" containerName="ovn-controller" probeResult="failure" output=<
Mar 19 15:35:38 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 19 15:35:38 crc kubenswrapper[4771]: >
Mar 19 15:35:39 crc kubenswrapper[4771]: I0319 15:35:39.317022 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"8f6a3c52d28bf57cc459d662dd200afc3fe2897a7b69316267619435cbdac8b9"}
Mar 19 15:35:39 crc kubenswrapper[4771]: I0319 15:35:39.318816 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 15:35:39 crc kubenswrapper[4771]: I0319 15:35:39.322754 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"82f408c9a22bfb4646611de514397e4072fd17ead089d0bd1dc250f8a4865cb2"}
Mar 19 15:35:39 crc kubenswrapper[4771]: I0319 15:35:39.323138 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.382530 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-78dl7"]
Mar 19 15:35:40 crc kubenswrapper[4771]: E0319 15:35:40.383478 4771 cpu_manager.go:410] "RemoveStaleState: removing
container" podUID="f500993b-fcda-483e-a93c-b56f12ded0c0" containerName="mariadb-account-create-update" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.383508 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f500993b-fcda-483e-a93c-b56f12ded0c0" containerName="mariadb-account-create-update" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.383663 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f500993b-fcda-483e-a93c-b56f12ded0c0" containerName="mariadb-account-create-update" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.384279 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-78dl7" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.387203 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.391663 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-78dl7"] Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.520272 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcbwb\" (UniqueName: \"kubernetes.io/projected/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-kube-api-access-lcbwb\") pod \"root-account-create-update-78dl7\" (UID: \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\") " pod="openstack/root-account-create-update-78dl7" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.520467 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-operator-scripts\") pod \"root-account-create-update-78dl7\" (UID: \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\") " pod="openstack/root-account-create-update-78dl7" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.621494 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-operator-scripts\") pod \"root-account-create-update-78dl7\" (UID: \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\") " pod="openstack/root-account-create-update-78dl7" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.621656 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcbwb\" (UniqueName: \"kubernetes.io/projected/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-kube-api-access-lcbwb\") pod \"root-account-create-update-78dl7\" (UID: \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\") " pod="openstack/root-account-create-update-78dl7" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.622758 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-operator-scripts\") pod \"root-account-create-update-78dl7\" (UID: \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\") " pod="openstack/root-account-create-update-78dl7" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.646233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcbwb\" (UniqueName: \"kubernetes.io/projected/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-kube-api-access-lcbwb\") pod \"root-account-create-update-78dl7\" (UID: \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\") " pod="openstack/root-account-create-update-78dl7" Mar 19 15:35:40 crc kubenswrapper[4771]: I0319 15:35:40.708623 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-78dl7" Mar 19 15:35:43 crc kubenswrapper[4771]: I0319 15:35:43.781026 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w5jsx" podUID="e8c6ed19-f258-4da0-966a-6c538b85dce1" containerName="ovn-controller" probeResult="failure" output=< Mar 19 15:35:43 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 15:35:43 crc kubenswrapper[4771]: > Mar 19 15:35:43 crc kubenswrapper[4771]: I0319 15:35:43.791446 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rs8tv" Mar 19 15:35:43 crc kubenswrapper[4771]: I0319 15:35:43.810544 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rs8tv" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.025204 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-w5jsx-config-7tzlt"] Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.026273 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.029093 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.033928 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w5jsx-config-7tzlt"] Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.178199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-log-ovn\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.178267 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run-ovn\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.178299 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.178373 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-scripts\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") 
" pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.178444 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-additional-scripts\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.178480 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtxm6\" (UniqueName: \"kubernetes.io/projected/fed51b11-5d52-4863-9987-3d819310fc2f-kube-api-access-wtxm6\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.279557 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-additional-scripts\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.279618 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtxm6\" (UniqueName: \"kubernetes.io/projected/fed51b11-5d52-4863-9987-3d819310fc2f-kube-api-access-wtxm6\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.279675 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-log-ovn\") pod 
\"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.279708 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run-ovn\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.279724 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.279761 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-scripts\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.280074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run-ovn\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.280074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-log-ovn\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: 
\"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.280139 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.280721 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-additional-scripts\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.281722 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-scripts\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.299037 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtxm6\" (UniqueName: \"kubernetes.io/projected/fed51b11-5d52-4863-9987-3d819310fc2f-kube-api-access-wtxm6\") pod \"ovn-controller-w5jsx-config-7tzlt\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.346022 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.363546 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="8f6a3c52d28bf57cc459d662dd200afc3fe2897a7b69316267619435cbdac8b9" exitCode=0 Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.363599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"8f6a3c52d28bf57cc459d662dd200afc3fe2897a7b69316267619435cbdac8b9"} Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.363648 4771 scope.go:117] "RemoveContainer" containerID="46cae5d55136c70e0709cacbf082bf077e68ce5a3bc258945276fab541b2cf08" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.364589 4771 scope.go:117] "RemoveContainer" containerID="8f6a3c52d28bf57cc459d662dd200afc3fe2897a7b69316267619435cbdac8b9" Mar 19 15:35:44 crc kubenswrapper[4771]: E0319 15:35:44.364895 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 10s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.368133 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="82f408c9a22bfb4646611de514397e4072fd17ead089d0bd1dc250f8a4865cb2" exitCode=0 Mar 19 15:35:44 crc kubenswrapper[4771]: I0319 15:35:44.368278 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"82f408c9a22bfb4646611de514397e4072fd17ead089d0bd1dc250f8a4865cb2"} Mar 19 15:35:44 crc 
kubenswrapper[4771]: I0319 15:35:44.369161 4771 scope.go:117] "RemoveContainer" containerID="82f408c9a22bfb4646611de514397e4072fd17ead089d0bd1dc250f8a4865cb2" Mar 19 15:35:44 crc kubenswrapper[4771]: E0319 15:35:44.369420 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 10s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:35:46 crc kubenswrapper[4771]: I0319 15:35:46.777956 4771 scope.go:117] "RemoveContainer" containerID="d9f43f1e7de39a493f7959b4f45870b1b227f88706b71e937eaab9ff6aaa0c04" Mar 19 15:35:47 crc kubenswrapper[4771]: I0319 15:35:47.229354 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-78dl7"] Mar 19 15:35:47 crc kubenswrapper[4771]: W0319 15:35:47.234281 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e5b34d9_0789_42e6_aa3d_1f1996f6befc.slice/crio-afdb4fbb6d4c49881b43e2a8c4d2c192165e2487d460e99df5ef994ccb90a7f9 WatchSource:0}: Error finding container afdb4fbb6d4c49881b43e2a8c4d2c192165e2487d460e99df5ef994ccb90a7f9: Status 404 returned error can't find the container with id afdb4fbb6d4c49881b43e2a8c4d2c192165e2487d460e99df5ef994ccb90a7f9 Mar 19 15:35:47 crc kubenswrapper[4771]: I0319 15:35:47.242229 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 19 15:35:47 crc kubenswrapper[4771]: I0319 15:35:47.296188 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w5jsx-config-7tzlt"] Mar 19 15:35:47 crc kubenswrapper[4771]: W0319 15:35:47.318843 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfed51b11_5d52_4863_9987_3d819310fc2f.slice/crio-3ce3267ca23eebe7448cec25a79c3f09d23b65ae7e5942067a7f05eb170e49ea WatchSource:0}: Error finding container 3ce3267ca23eebe7448cec25a79c3f09d23b65ae7e5942067a7f05eb170e49ea: Status 404 returned error can't find the container with id 3ce3267ca23eebe7448cec25a79c3f09d23b65ae7e5942067a7f05eb170e49ea Mar 19 15:35:47 crc kubenswrapper[4771]: I0319 15:35:47.403294 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w5jsx-config-7tzlt" event={"ID":"fed51b11-5d52-4863-9987-3d819310fc2f","Type":"ContainerStarted","Data":"3ce3267ca23eebe7448cec25a79c3f09d23b65ae7e5942067a7f05eb170e49ea"} Mar 19 15:35:47 crc kubenswrapper[4771]: I0319 15:35:47.408831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"fd52165971ccaa5f10c21214149bde2ba624b5b8fe9c36ac5c7509ff1beeaac0"} Mar 19 15:35:47 crc kubenswrapper[4771]: I0319 15:35:47.408859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"a86d8ad09a9b1988e99e2b9c8d7120f78c64248ef328777516c32cf968e718fe"} Mar 19 15:35:47 crc kubenswrapper[4771]: I0319 15:35:47.408870 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"de4d37d5039d8f9a37b88bece9ed0464de8ee2ef185d05446310f61aadc4d8f8"} Mar 19 15:35:47 crc kubenswrapper[4771]: I0319 15:35:47.413850 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-78dl7" event={"ID":"1e5b34d9-0789-42e6-aa3d-1f1996f6befc","Type":"ContainerStarted","Data":"afdb4fbb6d4c49881b43e2a8c4d2c192165e2487d460e99df5ef994ccb90a7f9"} Mar 19 15:35:48 crc kubenswrapper[4771]: 
I0319 15:35:48.425182 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mwcmg" event={"ID":"04404b99-3d2c-42a9-8137-f9f7991d2d06","Type":"ContainerStarted","Data":"134c2c9bc05f58f8fe7f26996e5f4fb6c536a57fbcb2e4ab153ee21931c8bdd3"} Mar 19 15:35:48 crc kubenswrapper[4771]: I0319 15:35:48.426928 4771 generic.go:334] "Generic (PLEG): container finished" podID="1e5b34d9-0789-42e6-aa3d-1f1996f6befc" containerID="f7d38bd03d7a2891b81764d6f808df090460386a76acc1db6fd8eb781b49f624" exitCode=0 Mar 19 15:35:48 crc kubenswrapper[4771]: I0319 15:35:48.426976 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-78dl7" event={"ID":"1e5b34d9-0789-42e6-aa3d-1f1996f6befc","Type":"ContainerDied","Data":"f7d38bd03d7a2891b81764d6f808df090460386a76acc1db6fd8eb781b49f624"} Mar 19 15:35:48 crc kubenswrapper[4771]: I0319 15:35:48.428514 4771 generic.go:334] "Generic (PLEG): container finished" podID="fed51b11-5d52-4863-9987-3d819310fc2f" containerID="2a93cacee0d6856fe3c176fc688b030feb95f2cc7ac9b364893de25c5997cc63" exitCode=0 Mar 19 15:35:48 crc kubenswrapper[4771]: I0319 15:35:48.428629 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w5jsx-config-7tzlt" event={"ID":"fed51b11-5d52-4863-9987-3d819310fc2f","Type":"ContainerDied","Data":"2a93cacee0d6856fe3c176fc688b030feb95f2cc7ac9b364893de25c5997cc63"} Mar 19 15:35:48 crc kubenswrapper[4771]: I0319 15:35:48.433360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"9645eb8ac54693ad95e23fe33b1d9bb73ee3c82fefd9d3fe05f3c35ea4ac526d"} Mar 19 15:35:48 crc kubenswrapper[4771]: I0319 15:35:48.446728 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-78dl7" podStartSLOduration=8.446710113 podStartE2EDuration="8.446710113s" podCreationTimestamp="2026-03-19 
15:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:47.431185187 +0000 UTC m=+1206.659806419" watchObservedRunningTime="2026-03-19 15:35:48.446710113 +0000 UTC m=+1207.675331315" Mar 19 15:35:48 crc kubenswrapper[4771]: I0319 15:35:48.485820 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-mwcmg" podStartSLOduration=3.79967453 podStartE2EDuration="15.485800989s" podCreationTimestamp="2026-03-19 15:35:33 +0000 UTC" firstStartedPulling="2026-03-19 15:35:35.177465543 +0000 UTC m=+1194.406086745" lastFinishedPulling="2026-03-19 15:35:46.863591982 +0000 UTC m=+1206.092213204" observedRunningTime="2026-03-19 15:35:48.456507907 +0000 UTC m=+1207.685129109" watchObservedRunningTime="2026-03-19 15:35:48.485800989 +0000 UTC m=+1207.714422191" Mar 19 15:35:48 crc kubenswrapper[4771]: I0319 15:35:48.792299 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-w5jsx" Mar 19 15:35:49 crc kubenswrapper[4771]: I0319 15:35:49.828294 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w5jsx-config-7tzlt" Mar 19 15:35:49 crc kubenswrapper[4771]: I0319 15:35:49.834732 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-78dl7" Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.996577 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtxm6\" (UniqueName: \"kubernetes.io/projected/fed51b11-5d52-4863-9987-3d819310fc2f-kube-api-access-wtxm6\") pod \"fed51b11-5d52-4863-9987-3d819310fc2f\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.996933 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-operator-scripts\") pod \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\" (UID: \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\") " Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.996974 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-log-ovn\") pod \"fed51b11-5d52-4863-9987-3d819310fc2f\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997010 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-additional-scripts\") pod \"fed51b11-5d52-4863-9987-3d819310fc2f\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997071 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run\") pod \"fed51b11-5d52-4863-9987-3d819310fc2f\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997156 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fed51b11-5d52-4863-9987-3d819310fc2f" (UID: "fed51b11-5d52-4863-9987-3d819310fc2f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997196 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run-ovn\") pod \"fed51b11-5d52-4863-9987-3d819310fc2f\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997224 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run" (OuterVolumeSpecName: "var-run") pod "fed51b11-5d52-4863-9987-3d819310fc2f" (UID: "fed51b11-5d52-4863-9987-3d819310fc2f"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997256 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-scripts\") pod \"fed51b11-5d52-4863-9987-3d819310fc2f\" (UID: \"fed51b11-5d52-4863-9987-3d819310fc2f\") " Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997275 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcbwb\" (UniqueName: \"kubernetes.io/projected/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-kube-api-access-lcbwb\") pod \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\" (UID: \"1e5b34d9-0789-42e6-aa3d-1f1996f6befc\") " Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997333 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fed51b11-5d52-4863-9987-3d819310fc2f" (UID: "fed51b11-5d52-4863-9987-3d819310fc2f"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997615 4771 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997628 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997637 4771 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fed51b11-5d52-4863-9987-3d819310fc2f-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e5b34d9-0789-42e6-aa3d-1f1996f6befc" (UID: "1e5b34d9-0789-42e6-aa3d-1f1996f6befc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.997842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fed51b11-5d52-4863-9987-3d819310fc2f" (UID: "fed51b11-5d52-4863-9987-3d819310fc2f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:49.998386 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-scripts" (OuterVolumeSpecName: "scripts") pod "fed51b11-5d52-4863-9987-3d819310fc2f" (UID: "fed51b11-5d52-4863-9987-3d819310fc2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.000659 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed51b11-5d52-4863-9987-3d819310fc2f-kube-api-access-wtxm6" (OuterVolumeSpecName: "kube-api-access-wtxm6") pod "fed51b11-5d52-4863-9987-3d819310fc2f" (UID: "fed51b11-5d52-4863-9987-3d819310fc2f"). InnerVolumeSpecName "kube-api-access-wtxm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.002619 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-kube-api-access-lcbwb" (OuterVolumeSpecName: "kube-api-access-lcbwb") pod "1e5b34d9-0789-42e6-aa3d-1f1996f6befc" (UID: "1e5b34d9-0789-42e6-aa3d-1f1996f6befc"). InnerVolumeSpecName "kube-api-access-lcbwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.099264 4771 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.099293 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed51b11-5d52-4863-9987-3d819310fc2f-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.099320 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcbwb\" (UniqueName: \"kubernetes.io/projected/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-kube-api-access-lcbwb\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.099331 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtxm6\" (UniqueName: \"kubernetes.io/projected/fed51b11-5d52-4863-9987-3d819310fc2f-kube-api-access-wtxm6\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.099340 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e5b34d9-0789-42e6-aa3d-1f1996f6befc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.455711 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"7c7da1af3e0c8c7ef0a5a727170886dd1daa6e7bef4b2dd2c46c64ac67526f3c"}
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.455778 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"94cc4c78e45e397a8ddbcef051972d2afa6f146548a0244c368e29fd12346b1e"}
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.457703 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-78dl7" event={"ID":"1e5b34d9-0789-42e6-aa3d-1f1996f6befc","Type":"ContainerDied","Data":"afdb4fbb6d4c49881b43e2a8c4d2c192165e2487d460e99df5ef994ccb90a7f9"}
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.457738 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afdb4fbb6d4c49881b43e2a8c4d2c192165e2487d460e99df5ef994ccb90a7f9"
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.457766 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-78dl7"
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.460159 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w5jsx-config-7tzlt" event={"ID":"fed51b11-5d52-4863-9987-3d819310fc2f","Type":"ContainerDied","Data":"3ce3267ca23eebe7448cec25a79c3f09d23b65ae7e5942067a7f05eb170e49ea"}
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.460194 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ce3267ca23eebe7448cec25a79c3f09d23b65ae7e5942067a7f05eb170e49ea"
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.460272 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w5jsx-config-7tzlt"
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.935754 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-w5jsx-config-7tzlt"]
Mar 19 15:35:50 crc kubenswrapper[4771]: I0319 15:35:50.936160 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-w5jsx-config-7tzlt"]
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.046678 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-w5jsx-config-p6skt"]
Mar 19 15:35:51 crc kubenswrapper[4771]: E0319 15:35:51.049067 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5b34d9-0789-42e6-aa3d-1f1996f6befc" containerName="mariadb-account-create-update"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.049090 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5b34d9-0789-42e6-aa3d-1f1996f6befc" containerName="mariadb-account-create-update"
Mar 19 15:35:51 crc kubenswrapper[4771]: E0319 15:35:51.049125 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed51b11-5d52-4863-9987-3d819310fc2f" containerName="ovn-config"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.049132 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed51b11-5d52-4863-9987-3d819310fc2f" containerName="ovn-config"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.049271 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed51b11-5d52-4863-9987-3d819310fc2f" containerName="ovn-config"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.049296 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e5b34d9-0789-42e6-aa3d-1f1996f6befc" containerName="mariadb-account-create-update"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.049821 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.063357 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.071787 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w5jsx-config-p6skt"]
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.121684 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-scripts\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.121755 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qpv\" (UniqueName: \"kubernetes.io/projected/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-kube-api-access-56qpv\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.121788 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-additional-scripts\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.121900 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.121972 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-log-ovn\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.122011 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run-ovn\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.223342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.223415 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-log-ovn\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.223438 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run-ovn\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.223605 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-scripts\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.223698 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56qpv\" (UniqueName: \"kubernetes.io/projected/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-kube-api-access-56qpv\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.223718 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-additional-scripts\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.223717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.224076 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run-ovn\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.224517 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-additional-scripts\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.226203 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-scripts\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.226275 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-log-ovn\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.241117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qpv\" (UniqueName: \"kubernetes.io/projected/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-kube-api-access-56qpv\") pod \"ovn-controller-w5jsx-config-p6skt\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") " pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.398781 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.492366 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"f13df092547ed04ecc37bd635128fdf978c0adab81dd1abb76feea091a67515b"}
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.492413 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"86ae6f24a7b200642b51418ab463562e779ba21427426d58a16058429e58ca3b"}
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.492427 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"aa108656a094f9b5391e8c51ec9a63a50a4dbd2c7a12f4afc36edc7a5f193c6f"}
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.492437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"de2b4707382315a03cfbfd2f7151ce99030555f4f55412832ee8ee1b587e38b3"}
Mar 19 15:35:51 crc kubenswrapper[4771]: I0319 15:35:51.548678 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed51b11-5d52-4863-9987-3d819310fc2f" path="/var/lib/kubelet/pods/fed51b11-5d52-4863-9987-3d819310fc2f/volumes"
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:51.646179 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w5jsx-config-p6skt"]
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:51.667046 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-78dl7"]
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:51.695162 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-78dl7"]
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:52.503337 4771 generic.go:334] "Generic (PLEG): container finished" podID="4341e49f-b5f9-41fa-ab77-92b4f39b34d9" containerID="74db2422ce5d601da90ad2006b630bc77879d8f86b62fd4c7b4adc48c3de604d" exitCode=0
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:52.503409 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w5jsx-config-p6skt" event={"ID":"4341e49f-b5f9-41fa-ab77-92b4f39b34d9","Type":"ContainerDied","Data":"74db2422ce5d601da90ad2006b630bc77879d8f86b62fd4c7b4adc48c3de604d"}
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:52.503821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w5jsx-config-p6skt" event={"ID":"4341e49f-b5f9-41fa-ab77-92b4f39b34d9","Type":"ContainerStarted","Data":"2c9a4ccf2b8c8091d29ff9f79083f082537695de93becc4412fd1fccc6eaedf4"}
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:52.513717 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67d58e24-649b-4142-a62a-64c9919fe0e4","Type":"ContainerStarted","Data":"127121fc50dd925aee354db0a1f2d2a621035f0384000d98edb7c2c3f98d2cdb"}
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:52.600214 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.423428547 podStartE2EDuration="37.600195901s" podCreationTimestamp="2026-03-19 15:35:15 +0000 UTC" firstStartedPulling="2026-03-19 15:35:33.4781901 +0000 UTC m=+1192.706811302" lastFinishedPulling="2026-03-19 15:35:49.654957454 +0000 UTC m=+1208.883578656" observedRunningTime="2026-03-19 15:35:52.594195088 +0000 UTC m=+1211.822816340" watchObservedRunningTime="2026-03-19 15:35:52.600195901 +0000 UTC m=+1211.828817113"
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:52.925719 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-bktbk"]
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:52.927314 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:52.929505 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 19 15:35:52 crc kubenswrapper[4771]: I0319 15:35:52.945526 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-bktbk"]
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.027101 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.027175 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.059106 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.059151 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.059200 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.059258 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-config\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.059274 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.059298 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wplhc\" (UniqueName: \"kubernetes.io/projected/d5517f09-737f-4a05-8293-d646d035c4ea-kube-api-access-wplhc\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.160627 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.160671 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.160720 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.160754 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-config\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.160793 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.160825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wplhc\" (UniqueName: \"kubernetes.io/projected/d5517f09-737f-4a05-8293-d646d035c4ea-kube-api-access-wplhc\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.161923 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.162441 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.162936 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.163500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.163952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-config\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.186262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wplhc\" (UniqueName: \"kubernetes.io/projected/d5517f09-737f-4a05-8293-d646d035c4ea-kube-api-access-wplhc\") pod \"dnsmasq-dns-77585f5f8c-bktbk\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") " pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.243188 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.521893 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e5b34d9-0789-42e6-aa3d-1f1996f6befc" path="/var/lib/kubelet/pods/1e5b34d9-0789-42e6-aa3d-1f1996f6befc/volumes"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.703265 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-bktbk"]
Mar 19 15:35:53 crc kubenswrapper[4771]: W0319 15:35:53.711974 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5517f09_737f_4a05_8293_d646d035c4ea.slice/crio-83e5a3e667096ddb9c0c0f8031dcc3b97fa310968e53900f9fc3f52b40a53529 WatchSource:0}: Error finding container 83e5a3e667096ddb9c0c0f8031dcc3b97fa310968e53900f9fc3f52b40a53529: Status 404 returned error can't find the container with id 83e5a3e667096ddb9c0c0f8031dcc3b97fa310968e53900f9fc3f52b40a53529
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.811630 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.884036 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-scripts\") pod \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") "
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.884084 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run-ovn\") pod \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") "
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.884153 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-additional-scripts\") pod \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") "
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.884197 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56qpv\" (UniqueName: \"kubernetes.io/projected/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-kube-api-access-56qpv\") pod \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") "
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.884234 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-log-ovn\") pod \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") "
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.884392 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run\") pod \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\" (UID: \"4341e49f-b5f9-41fa-ab77-92b4f39b34d9\") "
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.884795 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run" (OuterVolumeSpecName: "var-run") pod "4341e49f-b5f9-41fa-ab77-92b4f39b34d9" (UID: "4341e49f-b5f9-41fa-ab77-92b4f39b34d9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.885514 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4341e49f-b5f9-41fa-ab77-92b4f39b34d9" (UID: "4341e49f-b5f9-41fa-ab77-92b4f39b34d9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.885739 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4341e49f-b5f9-41fa-ab77-92b4f39b34d9" (UID: "4341e49f-b5f9-41fa-ab77-92b4f39b34d9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.886318 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4341e49f-b5f9-41fa-ab77-92b4f39b34d9" (UID: "4341e49f-b5f9-41fa-ab77-92b4f39b34d9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.886512 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-scripts" (OuterVolumeSpecName: "scripts") pod "4341e49f-b5f9-41fa-ab77-92b4f39b34d9" (UID: "4341e49f-b5f9-41fa-ab77-92b4f39b34d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.889251 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-kube-api-access-56qpv" (OuterVolumeSpecName: "kube-api-access-56qpv") pod "4341e49f-b5f9-41fa-ab77-92b4f39b34d9" (UID: "4341e49f-b5f9-41fa-ab77-92b4f39b34d9"). InnerVolumeSpecName "kube-api-access-56qpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.985970 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.986027 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.986038 4771 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.986048 4771 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.986058 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56qpv\" (UniqueName: \"kubernetes.io/projected/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-kube-api-access-56qpv\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:53 crc kubenswrapper[4771]: I0319 15:35:53.986068 4771 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4341e49f-b5f9-41fa-ab77-92b4f39b34d9-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 19 15:35:54 crc kubenswrapper[4771]: I0319 15:35:54.537128 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w5jsx-config-p6skt" event={"ID":"4341e49f-b5f9-41fa-ab77-92b4f39b34d9","Type":"ContainerDied","Data":"2c9a4ccf2b8c8091d29ff9f79083f082537695de93becc4412fd1fccc6eaedf4"}
Mar 19 15:35:54 crc kubenswrapper[4771]: I0319 15:35:54.537456 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9a4ccf2b8c8091d29ff9f79083f082537695de93becc4412fd1fccc6eaedf4"
Mar 19 15:35:54 crc kubenswrapper[4771]: I0319 15:35:54.537200 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w5jsx-config-p6skt"
Mar 19 15:35:54 crc kubenswrapper[4771]: I0319 15:35:54.540502 4771 generic.go:334] "Generic (PLEG): container finished" podID="d5517f09-737f-4a05-8293-d646d035c4ea" containerID="f19f47f7621886863d38d7f0bcdea4d8605c7b62ed90a4160cbba9353ac304a2" exitCode=0
Mar 19 15:35:54 crc kubenswrapper[4771]: I0319 15:35:54.540561 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk" event={"ID":"d5517f09-737f-4a05-8293-d646d035c4ea","Type":"ContainerDied","Data":"f19f47f7621886863d38d7f0bcdea4d8605c7b62ed90a4160cbba9353ac304a2"}
Mar 19 15:35:54 crc kubenswrapper[4771]: I0319 15:35:54.540600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk" event={"ID":"d5517f09-737f-4a05-8293-d646d035c4ea","Type":"ContainerStarted","Data":"83e5a3e667096ddb9c0c0f8031dcc3b97fa310968e53900f9fc3f52b40a53529"}
Mar 19 15:35:54 crc kubenswrapper[4771]: I0319 15:35:54.897210 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-w5jsx-config-p6skt"]
Mar 19 15:35:54 crc kubenswrapper[4771]: I0319 15:35:54.906972 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-w5jsx-config-p6skt"]
Mar 19 15:35:55 crc kubenswrapper[4771]: I0319 15:35:55.524866 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4341e49f-b5f9-41fa-ab77-92b4f39b34d9" path="/var/lib/kubelet/pods/4341e49f-b5f9-41fa-ab77-92b4f39b34d9/volumes"
Mar 19 15:35:55 crc kubenswrapper[4771]: I0319 15:35:55.551533 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk" event={"ID":"d5517f09-737f-4a05-8293-d646d035c4ea","Type":"ContainerStarted","Data":"1c02f0a334d1c859ae293acb2f35e7f9444215760af8f44efe70337cb9a936dc"}
Mar 19 15:35:55 crc kubenswrapper[4771]: I0319 15:35:55.552017 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openstack/dnsmasq-dns-77585f5f8c-bktbk" Mar 19 15:35:55 crc kubenswrapper[4771]: I0319 15:35:55.575394 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk" podStartSLOduration=3.575378256 podStartE2EDuration="3.575378256s" podCreationTimestamp="2026-03-19 15:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:35:55.570363346 +0000 UTC m=+1214.798984558" watchObservedRunningTime="2026-03-19 15:35:55.575378256 +0000 UTC m=+1214.803999458" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.741466 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-275z8"] Mar 19 15:35:56 crc kubenswrapper[4771]: E0319 15:35:56.741865 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4341e49f-b5f9-41fa-ab77-92b4f39b34d9" containerName="ovn-config" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.741881 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4341e49f-b5f9-41fa-ab77-92b4f39b34d9" containerName="ovn-config" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.742161 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4341e49f-b5f9-41fa-ab77-92b4f39b34d9" containerName="ovn-config" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.742700 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-275z8"] Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.742790 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-275z8" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.762101 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.862097 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbsv\" (UniqueName: \"kubernetes.io/projected/7a0387e9-2daa-4556-9745-450c0e3015dd-kube-api-access-cvbsv\") pod \"root-account-create-update-275z8\" (UID: \"7a0387e9-2daa-4556-9745-450c0e3015dd\") " pod="openstack/root-account-create-update-275z8" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.862205 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0387e9-2daa-4556-9745-450c0e3015dd-operator-scripts\") pod \"root-account-create-update-275z8\" (UID: \"7a0387e9-2daa-4556-9745-450c0e3015dd\") " pod="openstack/root-account-create-update-275z8" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.964571 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbsv\" (UniqueName: \"kubernetes.io/projected/7a0387e9-2daa-4556-9745-450c0e3015dd-kube-api-access-cvbsv\") pod \"root-account-create-update-275z8\" (UID: \"7a0387e9-2daa-4556-9745-450c0e3015dd\") " pod="openstack/root-account-create-update-275z8" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.964719 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0387e9-2daa-4556-9745-450c0e3015dd-operator-scripts\") pod \"root-account-create-update-275z8\" (UID: \"7a0387e9-2daa-4556-9745-450c0e3015dd\") " pod="openstack/root-account-create-update-275z8" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.966425 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0387e9-2daa-4556-9745-450c0e3015dd-operator-scripts\") pod \"root-account-create-update-275z8\" (UID: \"7a0387e9-2daa-4556-9745-450c0e3015dd\") " pod="openstack/root-account-create-update-275z8" Mar 19 15:35:56 crc kubenswrapper[4771]: I0319 15:35:56.992645 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbsv\" (UniqueName: \"kubernetes.io/projected/7a0387e9-2daa-4556-9745-450c0e3015dd-kube-api-access-cvbsv\") pod \"root-account-create-update-275z8\" (UID: \"7a0387e9-2daa-4556-9745-450c0e3015dd\") " pod="openstack/root-account-create-update-275z8" Mar 19 15:35:57 crc kubenswrapper[4771]: I0319 15:35:57.077277 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-275z8" Mar 19 15:35:57 crc kubenswrapper[4771]: I0319 15:35:57.517374 4771 scope.go:117] "RemoveContainer" containerID="82f408c9a22bfb4646611de514397e4072fd17ead089d0bd1dc250f8a4865cb2" Mar 19 15:35:57 crc kubenswrapper[4771]: W0319 15:35:57.536697 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a0387e9_2daa_4556_9745_450c0e3015dd.slice/crio-75cb80b3821d645194181953cef2186e244f9d21bdd004de5264ddacaba9fe7c WatchSource:0}: Error finding container 75cb80b3821d645194181953cef2186e244f9d21bdd004de5264ddacaba9fe7c: Status 404 returned error can't find the container with id 75cb80b3821d645194181953cef2186e244f9d21bdd004de5264ddacaba9fe7c Mar 19 15:35:57 crc kubenswrapper[4771]: I0319 15:35:57.537154 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-275z8"] Mar 19 15:35:57 crc kubenswrapper[4771]: I0319 15:35:57.570698 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-275z8" 
event={"ID":"7a0387e9-2daa-4556-9745-450c0e3015dd","Type":"ContainerStarted","Data":"75cb80b3821d645194181953cef2186e244f9d21bdd004de5264ddacaba9fe7c"} Mar 19 15:35:58 crc kubenswrapper[4771]: I0319 15:35:58.509347 4771 scope.go:117] "RemoveContainer" containerID="8f6a3c52d28bf57cc459d662dd200afc3fe2897a7b69316267619435cbdac8b9" Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.133249 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565576-4rhn7"] Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.136892 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565576-4rhn7" Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.141761 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.142101 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.144195 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.149273 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565576-4rhn7"] Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.219431 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8sx2\" (UniqueName: \"kubernetes.io/projected/523fdd36-adcb-4bde-9ba4-d0f5a9560caa-kube-api-access-k8sx2\") pod \"auto-csr-approver-29565576-4rhn7\" (UID: \"523fdd36-adcb-4bde-9ba4-d0f5a9560caa\") " pod="openshift-infra/auto-csr-approver-29565576-4rhn7" Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.321096 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k8sx2\" (UniqueName: \"kubernetes.io/projected/523fdd36-adcb-4bde-9ba4-d0f5a9560caa-kube-api-access-k8sx2\") pod \"auto-csr-approver-29565576-4rhn7\" (UID: \"523fdd36-adcb-4bde-9ba4-d0f5a9560caa\") " pod="openshift-infra/auto-csr-approver-29565576-4rhn7" Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.340117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8sx2\" (UniqueName: \"kubernetes.io/projected/523fdd36-adcb-4bde-9ba4-d0f5a9560caa-kube-api-access-k8sx2\") pod \"auto-csr-approver-29565576-4rhn7\" (UID: \"523fdd36-adcb-4bde-9ba4-d0f5a9560caa\") " pod="openshift-infra/auto-csr-approver-29565576-4rhn7" Mar 19 15:36:00 crc kubenswrapper[4771]: I0319 15:36:00.456852 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565576-4rhn7" Mar 19 15:36:01 crc kubenswrapper[4771]: I0319 15:36:01.047362 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565576-4rhn7"] Mar 19 15:36:01 crc kubenswrapper[4771]: W0319 15:36:01.058868 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod523fdd36_adcb_4bde_9ba4_d0f5a9560caa.slice/crio-235326203ccb1a370fb0a446027126e543f30351cda54fa358422755bc1d0520 WatchSource:0}: Error finding container 235326203ccb1a370fb0a446027126e543f30351cda54fa358422755bc1d0520: Status 404 returned error can't find the container with id 235326203ccb1a370fb0a446027126e543f30351cda54fa358422755bc1d0520 Mar 19 15:36:01 crc kubenswrapper[4771]: I0319 15:36:01.616285 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565576-4rhn7" event={"ID":"523fdd36-adcb-4bde-9ba4-d0f5a9560caa","Type":"ContainerStarted","Data":"235326203ccb1a370fb0a446027126e543f30351cda54fa358422755bc1d0520"} Mar 19 15:36:02 crc kubenswrapper[4771]: I0319 15:36:02.627943 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-275z8" event={"ID":"7a0387e9-2daa-4556-9745-450c0e3015dd","Type":"ContainerStarted","Data":"47c4d06dff1b7e394f1fd7c2fe6908fd24290a72bf595b2b00c23989d14771ee"} Mar 19 15:36:02 crc kubenswrapper[4771]: I0319 15:36:02.630646 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"6ae743143ba69d6b0b4a45930cf67ab154bf3857bc3a2b913bc411fef0b0bb62"} Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.244907 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk" Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.310578 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-p67w5"] Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.310858 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-p67w5" podUID="96edfd66-abb4-4935-adcf-22c80205e7c2" containerName="dnsmasq-dns" containerID="cri-o://c1957c063a8828a1191cca0264d683874c10c5562f461010bbde789460c7963c" gracePeriod=10 Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.638784 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565576-4rhn7" event={"ID":"523fdd36-adcb-4bde-9ba4-d0f5a9560caa","Type":"ContainerStarted","Data":"918da1980407aa4504720ade6b61ebfc02466296fe98aed6dfcced04a6dbca69"} Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.643548 4771 generic.go:334] "Generic (PLEG): container finished" podID="7a0387e9-2daa-4556-9745-450c0e3015dd" containerID="47c4d06dff1b7e394f1fd7c2fe6908fd24290a72bf595b2b00c23989d14771ee" exitCode=0 Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.643687 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-275z8" event={"ID":"7a0387e9-2daa-4556-9745-450c0e3015dd","Type":"ContainerDied","Data":"47c4d06dff1b7e394f1fd7c2fe6908fd24290a72bf595b2b00c23989d14771ee"} Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.650565 4771 generic.go:334] "Generic (PLEG): container finished" podID="96edfd66-abb4-4935-adcf-22c80205e7c2" containerID="c1957c063a8828a1191cca0264d683874c10c5562f461010bbde789460c7963c" exitCode=0 Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.653095 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-p67w5" event={"ID":"96edfd66-abb4-4935-adcf-22c80205e7c2","Type":"ContainerDied","Data":"c1957c063a8828a1191cca0264d683874c10c5562f461010bbde789460c7963c"} Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.653584 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565576-4rhn7" podStartSLOduration=1.513994695 podStartE2EDuration="3.653573264s" podCreationTimestamp="2026-03-19 15:36:00 +0000 UTC" firstStartedPulling="2026-03-19 15:36:01.061382143 +0000 UTC m=+1220.290003355" lastFinishedPulling="2026-03-19 15:36:03.200960712 +0000 UTC m=+1222.429581924" observedRunningTime="2026-03-19 15:36:03.651878003 +0000 UTC m=+1222.880499205" watchObservedRunningTime="2026-03-19 15:36:03.653573264 +0000 UTC m=+1222.882194466" Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.656365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"2d902a4936e82a188e94b40a59bd5bd8dcecd25c29a3d32b7128d5438cccb48e"} Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.656398 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.656801 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.875566 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.980518 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-dns-svc\") pod \"96edfd66-abb4-4935-adcf-22c80205e7c2\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.980563 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cpr5\" (UniqueName: \"kubernetes.io/projected/96edfd66-abb4-4935-adcf-22c80205e7c2-kube-api-access-9cpr5\") pod \"96edfd66-abb4-4935-adcf-22c80205e7c2\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.980606 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-sb\") pod \"96edfd66-abb4-4935-adcf-22c80205e7c2\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.980631 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-nb\") pod \"96edfd66-abb4-4935-adcf-22c80205e7c2\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " Mar 19 15:36:03 crc kubenswrapper[4771]: I0319 15:36:03.980687 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-config\") pod \"96edfd66-abb4-4935-adcf-22c80205e7c2\" (UID: \"96edfd66-abb4-4935-adcf-22c80205e7c2\") " Mar 
19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:03.999690 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96edfd66-abb4-4935-adcf-22c80205e7c2-kube-api-access-9cpr5" (OuterVolumeSpecName: "kube-api-access-9cpr5") pod "96edfd66-abb4-4935-adcf-22c80205e7c2" (UID: "96edfd66-abb4-4935-adcf-22c80205e7c2"). InnerVolumeSpecName "kube-api-access-9cpr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.025017 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "96edfd66-abb4-4935-adcf-22c80205e7c2" (UID: "96edfd66-abb4-4935-adcf-22c80205e7c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.027078 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "96edfd66-abb4-4935-adcf-22c80205e7c2" (UID: "96edfd66-abb4-4935-adcf-22c80205e7c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.034777 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-config" (OuterVolumeSpecName: "config") pod "96edfd66-abb4-4935-adcf-22c80205e7c2" (UID: "96edfd66-abb4-4935-adcf-22c80205e7c2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.051861 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "96edfd66-abb4-4935-adcf-22c80205e7c2" (UID: "96edfd66-abb4-4935-adcf-22c80205e7c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.082449 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.082485 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cpr5\" (UniqueName: \"kubernetes.io/projected/96edfd66-abb4-4935-adcf-22c80205e7c2-kube-api-access-9cpr5\") on node \"crc\" DevicePath \"\"" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.082498 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.082509 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.082520 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96edfd66-abb4-4935-adcf-22c80205e7c2-config\") on node \"crc\" DevicePath \"\"" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.664762 4771 generic.go:334] "Generic (PLEG): container finished" podID="523fdd36-adcb-4bde-9ba4-d0f5a9560caa" 
containerID="918da1980407aa4504720ade6b61ebfc02466296fe98aed6dfcced04a6dbca69" exitCode=0 Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.664832 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565576-4rhn7" event={"ID":"523fdd36-adcb-4bde-9ba4-d0f5a9560caa","Type":"ContainerDied","Data":"918da1980407aa4504720ade6b61ebfc02466296fe98aed6dfcced04a6dbca69"} Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.667444 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-p67w5" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.668420 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-p67w5" event={"ID":"96edfd66-abb4-4935-adcf-22c80205e7c2","Type":"ContainerDied","Data":"2a1605b8dde876a4c894a40e6499c87509299d74d90ba003a69e757f0816ee8c"} Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.668501 4771 scope.go:117] "RemoveContainer" containerID="c1957c063a8828a1191cca0264d683874c10c5562f461010bbde789460c7963c" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.710488 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-p67w5"] Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.715130 4771 scope.go:117] "RemoveContainer" containerID="b01204d15c6fd22ef3be6aa795baa0db56c6423b963e3d8a8bc2cd48f076bacc" Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.718682 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-p67w5"] Mar 19 15:36:04 crc kubenswrapper[4771]: I0319 15:36:04.972937 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-275z8" Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.098139 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0387e9-2daa-4556-9745-450c0e3015dd-operator-scripts\") pod \"7a0387e9-2daa-4556-9745-450c0e3015dd\" (UID: \"7a0387e9-2daa-4556-9745-450c0e3015dd\") " Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.098192 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbsv\" (UniqueName: \"kubernetes.io/projected/7a0387e9-2daa-4556-9745-450c0e3015dd-kube-api-access-cvbsv\") pod \"7a0387e9-2daa-4556-9745-450c0e3015dd\" (UID: \"7a0387e9-2daa-4556-9745-450c0e3015dd\") " Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.098537 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0387e9-2daa-4556-9745-450c0e3015dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a0387e9-2daa-4556-9745-450c0e3015dd" (UID: "7a0387e9-2daa-4556-9745-450c0e3015dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.103963 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0387e9-2daa-4556-9745-450c0e3015dd-kube-api-access-cvbsv" (OuterVolumeSpecName: "kube-api-access-cvbsv") pod "7a0387e9-2daa-4556-9745-450c0e3015dd" (UID: "7a0387e9-2daa-4556-9745-450c0e3015dd"). InnerVolumeSpecName "kube-api-access-cvbsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.199633 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0387e9-2daa-4556-9745-450c0e3015dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.199673 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbsv\" (UniqueName: \"kubernetes.io/projected/7a0387e9-2daa-4556-9745-450c0e3015dd-kube-api-access-cvbsv\") on node \"crc\" DevicePath \"\"" Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.522129 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96edfd66-abb4-4935-adcf-22c80205e7c2" path="/var/lib/kubelet/pods/96edfd66-abb4-4935-adcf-22c80205e7c2/volumes" Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.679833 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-275z8" event={"ID":"7a0387e9-2daa-4556-9745-450c0e3015dd","Type":"ContainerDied","Data":"75cb80b3821d645194181953cef2186e244f9d21bdd004de5264ddacaba9fe7c"} Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.679865 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-275z8" Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.679906 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75cb80b3821d645194181953cef2186e244f9d21bdd004de5264ddacaba9fe7c" Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.682550 4771 generic.go:334] "Generic (PLEG): container finished" podID="04404b99-3d2c-42a9-8137-f9f7991d2d06" containerID="134c2c9bc05f58f8fe7f26996e5f4fb6c536a57fbcb2e4ab153ee21931c8bdd3" exitCode=0 Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.682661 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mwcmg" event={"ID":"04404b99-3d2c-42a9-8137-f9f7991d2d06","Type":"ContainerDied","Data":"134c2c9bc05f58f8fe7f26996e5f4fb6c536a57fbcb2e4ab153ee21931c8bdd3"} Mar 19 15:36:05 crc kubenswrapper[4771]: I0319 15:36:05.988739 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565576-4rhn7" Mar 19 15:36:06 crc kubenswrapper[4771]: I0319 15:36:06.122261 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8sx2\" (UniqueName: \"kubernetes.io/projected/523fdd36-adcb-4bde-9ba4-d0f5a9560caa-kube-api-access-k8sx2\") pod \"523fdd36-adcb-4bde-9ba4-d0f5a9560caa\" (UID: \"523fdd36-adcb-4bde-9ba4-d0f5a9560caa\") " Mar 19 15:36:06 crc kubenswrapper[4771]: I0319 15:36:06.127158 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/523fdd36-adcb-4bde-9ba4-d0f5a9560caa-kube-api-access-k8sx2" (OuterVolumeSpecName: "kube-api-access-k8sx2") pod "523fdd36-adcb-4bde-9ba4-d0f5a9560caa" (UID: "523fdd36-adcb-4bde-9ba4-d0f5a9560caa"). InnerVolumeSpecName "kube-api-access-k8sx2". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:36:06 crc kubenswrapper[4771]: I0319 15:36:06.225129 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8sx2\" (UniqueName: \"kubernetes.io/projected/523fdd36-adcb-4bde-9ba4-d0f5a9560caa-kube-api-access-k8sx2\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:06 crc kubenswrapper[4771]: I0319 15:36:06.700911 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565576-4rhn7" event={"ID":"523fdd36-adcb-4bde-9ba4-d0f5a9560caa","Type":"ContainerDied","Data":"235326203ccb1a370fb0a446027126e543f30351cda54fa358422755bc1d0520"}
Mar 19 15:36:06 crc kubenswrapper[4771]: I0319 15:36:06.701017 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="235326203ccb1a370fb0a446027126e543f30351cda54fa358422755bc1d0520"
Mar 19 15:36:06 crc kubenswrapper[4771]: I0319 15:36:06.701045 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565576-4rhn7"
Mar 19 15:36:06 crc kubenswrapper[4771]: I0319 15:36:06.727771 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565570-jfqpb"]
Mar 19 15:36:06 crc kubenswrapper[4771]: I0319 15:36:06.734662 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565570-jfqpb"]
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.068330 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.139351 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgjzr\" (UniqueName: \"kubernetes.io/projected/04404b99-3d2c-42a9-8137-f9f7991d2d06-kube-api-access-dgjzr\") pod \"04404b99-3d2c-42a9-8137-f9f7991d2d06\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") "
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.139493 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-config-data\") pod \"04404b99-3d2c-42a9-8137-f9f7991d2d06\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") "
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.139532 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-db-sync-config-data\") pod \"04404b99-3d2c-42a9-8137-f9f7991d2d06\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") "
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.139678 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-combined-ca-bundle\") pod \"04404b99-3d2c-42a9-8137-f9f7991d2d06\" (UID: \"04404b99-3d2c-42a9-8137-f9f7991d2d06\") "
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.144423 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "04404b99-3d2c-42a9-8137-f9f7991d2d06" (UID: "04404b99-3d2c-42a9-8137-f9f7991d2d06"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.145755 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04404b99-3d2c-42a9-8137-f9f7991d2d06-kube-api-access-dgjzr" (OuterVolumeSpecName: "kube-api-access-dgjzr") pod "04404b99-3d2c-42a9-8137-f9f7991d2d06" (UID: "04404b99-3d2c-42a9-8137-f9f7991d2d06"). InnerVolumeSpecName "kube-api-access-dgjzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.179368 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04404b99-3d2c-42a9-8137-f9f7991d2d06" (UID: "04404b99-3d2c-42a9-8137-f9f7991d2d06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.191779 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-config-data" (OuterVolumeSpecName: "config-data") pod "04404b99-3d2c-42a9-8137-f9f7991d2d06" (UID: "04404b99-3d2c-42a9-8137-f9f7991d2d06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.242302 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.242338 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgjzr\" (UniqueName: \"kubernetes.io/projected/04404b99-3d2c-42a9-8137-f9f7991d2d06-kube-api-access-dgjzr\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.242352 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.242363 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04404b99-3d2c-42a9-8137-f9f7991d2d06-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.520520 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5faa2b3-fd6c-4380-b2a6-70749e033b35" path="/var/lib/kubelet/pods/a5faa2b3-fd6c-4380-b2a6-70749e033b35/volumes"
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.716734 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="2d902a4936e82a188e94b40a59bd5bd8dcecd25c29a3d32b7128d5438cccb48e" exitCode=0
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.716839 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"2d902a4936e82a188e94b40a59bd5bd8dcecd25c29a3d32b7128d5438cccb48e"}
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.716879 4771 scope.go:117] "RemoveContainer" containerID="8f6a3c52d28bf57cc459d662dd200afc3fe2897a7b69316267619435cbdac8b9"
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.717610 4771 scope.go:117] "RemoveContainer" containerID="2d902a4936e82a188e94b40a59bd5bd8dcecd25c29a3d32b7128d5438cccb48e"
Mar 19 15:36:07 crc kubenswrapper[4771]: E0319 15:36:07.717851 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.736156 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="6ae743143ba69d6b0b4a45930cf67ab154bf3857bc3a2b913bc411fef0b0bb62" exitCode=0
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.736351 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"6ae743143ba69d6b0b4a45930cf67ab154bf3857bc3a2b913bc411fef0b0bb62"}
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.737920 4771 scope.go:117] "RemoveContainer" containerID="6ae743143ba69d6b0b4a45930cf67ab154bf3857bc3a2b913bc411fef0b0bb62"
Mar 19 15:36:07 crc kubenswrapper[4771]: E0319 15:36:07.749601 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.754748 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mwcmg" event={"ID":"04404b99-3d2c-42a9-8137-f9f7991d2d06","Type":"ContainerDied","Data":"46d2f0639f332086eff706438ac5cdfd488d36e9d26219e69db8f95d90d40814"}
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.754799 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46d2f0639f332086eff706438ac5cdfd488d36e9d26219e69db8f95d90d40814"
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.754872 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mwcmg"
Mar 19 15:36:07 crc kubenswrapper[4771]: I0319 15:36:07.775453 4771 scope.go:117] "RemoveContainer" containerID="82f408c9a22bfb4646611de514397e4072fd17ead089d0bd1dc250f8a4865cb2"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.023160 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-hpc64"]
Mar 19 15:36:08 crc kubenswrapper[4771]: E0319 15:36:08.023833 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96edfd66-abb4-4935-adcf-22c80205e7c2" containerName="dnsmasq-dns"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.023853 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="96edfd66-abb4-4935-adcf-22c80205e7c2" containerName="dnsmasq-dns"
Mar 19 15:36:08 crc kubenswrapper[4771]: E0319 15:36:08.023876 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0387e9-2daa-4556-9745-450c0e3015dd" containerName="mariadb-account-create-update"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.023886 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0387e9-2daa-4556-9745-450c0e3015dd" containerName="mariadb-account-create-update"
Mar 19 15:36:08 crc kubenswrapper[4771]: E0319 15:36:08.023911 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96edfd66-abb4-4935-adcf-22c80205e7c2" containerName="init"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.023919 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="96edfd66-abb4-4935-adcf-22c80205e7c2" containerName="init"
Mar 19 15:36:08 crc kubenswrapper[4771]: E0319 15:36:08.023936 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04404b99-3d2c-42a9-8137-f9f7991d2d06" containerName="glance-db-sync"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.023944 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="04404b99-3d2c-42a9-8137-f9f7991d2d06" containerName="glance-db-sync"
Mar 19 15:36:08 crc kubenswrapper[4771]: E0319 15:36:08.023957 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523fdd36-adcb-4bde-9ba4-d0f5a9560caa" containerName="oc"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.023964 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="523fdd36-adcb-4bde-9ba4-d0f5a9560caa" containerName="oc"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.024138 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="04404b99-3d2c-42a9-8137-f9f7991d2d06" containerName="glance-db-sync"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.024153 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0387e9-2daa-4556-9745-450c0e3015dd" containerName="mariadb-account-create-update"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.024160 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="96edfd66-abb4-4935-adcf-22c80205e7c2" containerName="dnsmasq-dns"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.024178 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="523fdd36-adcb-4bde-9ba4-d0f5a9560caa" containerName="oc"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.025047 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.048511 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-hpc64"]
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.158940 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wpmr\" (UniqueName: \"kubernetes.io/projected/ac02ff02-302b-4ee5-98d2-59153e9f8d48-kube-api-access-8wpmr\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.159078 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.159130 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.159156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-config\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.159206 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.159244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.260328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.260649 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.260710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wpmr\" (UniqueName: \"kubernetes.io/projected/ac02ff02-302b-4ee5-98d2-59153e9f8d48-kube-api-access-8wpmr\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.260790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.260855 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.260886 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-config\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.261226 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.261499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.261862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.261873 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.262236 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac02ff02-302b-4ee5-98d2-59153e9f8d48-config\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.280914 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wpmr\" (UniqueName: \"kubernetes.io/projected/ac02ff02-302b-4ee5-98d2-59153e9f8d48-kube-api-access-8wpmr\") pod \"dnsmasq-dns-7ff5475cc9-hpc64\" (UID: \"ac02ff02-302b-4ee5-98d2-59153e9f8d48\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.354088 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:08 crc kubenswrapper[4771]: I0319 15:36:08.805510 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-hpc64"]
Mar 19 15:36:08 crc kubenswrapper[4771]: W0319 15:36:08.818674 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac02ff02_302b_4ee5_98d2_59153e9f8d48.slice/crio-1fea349ef1f10f2c7e259d6a8dea3f2d05a860099217c52daa3967fe14efc009 WatchSource:0}: Error finding container 1fea349ef1f10f2c7e259d6a8dea3f2d05a860099217c52daa3967fe14efc009: Status 404 returned error can't find the container with id 1fea349ef1f10f2c7e259d6a8dea3f2d05a860099217c52daa3967fe14efc009
Mar 19 15:36:09 crc kubenswrapper[4771]: I0319 15:36:09.782463 4771 generic.go:334] "Generic (PLEG): container finished" podID="ac02ff02-302b-4ee5-98d2-59153e9f8d48" containerID="c97a6210ebc6b6cb0464f446bbd062d8c3cadae84f294d7cd59ddf259cff04a3" exitCode=0
Mar 19 15:36:09 crc kubenswrapper[4771]: I0319 15:36:09.782516 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64" event={"ID":"ac02ff02-302b-4ee5-98d2-59153e9f8d48","Type":"ContainerDied","Data":"c97a6210ebc6b6cb0464f446bbd062d8c3cadae84f294d7cd59ddf259cff04a3"}
Mar 19 15:36:09 crc kubenswrapper[4771]: I0319 15:36:09.782775 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64" event={"ID":"ac02ff02-302b-4ee5-98d2-59153e9f8d48","Type":"ContainerStarted","Data":"1fea349ef1f10f2c7e259d6a8dea3f2d05a860099217c52daa3967fe14efc009"}
Mar 19 15:36:10 crc kubenswrapper[4771]: I0319 15:36:10.792160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64" event={"ID":"ac02ff02-302b-4ee5-98d2-59153e9f8d48","Type":"ContainerStarted","Data":"14ba7a706af0cbabfb72350c32f63187ee6a541f9a255de427510da2b7f59d4a"}
Mar 19 15:36:10 crc kubenswrapper[4771]: I0319 15:36:10.792540 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:10 crc kubenswrapper[4771]: I0319 15:36:10.818142 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64" podStartSLOduration=3.818118217 podStartE2EDuration="3.818118217s" podCreationTimestamp="2026-03-19 15:36:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 15:36:10.811398876 +0000 UTC m=+1230.040020078" watchObservedRunningTime="2026-03-19 15:36:10.818118217 +0000 UTC m=+1230.046739419"
Mar 19 15:36:18 crc kubenswrapper[4771]: I0319 15:36:18.356230 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ff5475cc9-hpc64"
Mar 19 15:36:18 crc kubenswrapper[4771]: I0319 15:36:18.428170 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-bktbk"]
Mar 19 15:36:18 crc kubenswrapper[4771]: I0319 15:36:18.428483 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk" podUID="d5517f09-737f-4a05-8293-d646d035c4ea" containerName="dnsmasq-dns" containerID="cri-o://1c02f0a334d1c859ae293acb2f35e7f9444215760af8f44efe70337cb9a936dc" gracePeriod=10
Mar 19 15:36:18 crc kubenswrapper[4771]: I0319 15:36:18.864643 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk" event={"ID":"d5517f09-737f-4a05-8293-d646d035c4ea","Type":"ContainerDied","Data":"1c02f0a334d1c859ae293acb2f35e7f9444215760af8f44efe70337cb9a936dc"}
Mar 19 15:36:18 crc kubenswrapper[4771]: I0319 15:36:18.864400 4771 generic.go:334] "Generic (PLEG): container finished" podID="d5517f09-737f-4a05-8293-d646d035c4ea" containerID="1c02f0a334d1c859ae293acb2f35e7f9444215760af8f44efe70337cb9a936dc" exitCode=0
Mar 19 15:36:18 crc kubenswrapper[4771]: I0319 15:36:18.871250 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk" event={"ID":"d5517f09-737f-4a05-8293-d646d035c4ea","Type":"ContainerDied","Data":"83e5a3e667096ddb9c0c0f8031dcc3b97fa310968e53900f9fc3f52b40a53529"}
Mar 19 15:36:18 crc kubenswrapper[4771]: I0319 15:36:18.871303 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e5a3e667096ddb9c0c0f8031dcc3b97fa310968e53900f9fc3f52b40a53529"
Mar 19 15:36:18 crc kubenswrapper[4771]: I0319 15:36:18.905115 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.057697 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-config\") pod \"d5517f09-737f-4a05-8293-d646d035c4ea\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") "
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.058163 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wplhc\" (UniqueName: \"kubernetes.io/projected/d5517f09-737f-4a05-8293-d646d035c4ea-kube-api-access-wplhc\") pod \"d5517f09-737f-4a05-8293-d646d035c4ea\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") "
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.058205 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-nb\") pod \"d5517f09-737f-4a05-8293-d646d035c4ea\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") "
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.058288 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-swift-storage-0\") pod \"d5517f09-737f-4a05-8293-d646d035c4ea\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") "
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.058315 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-sb\") pod \"d5517f09-737f-4a05-8293-d646d035c4ea\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") "
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.058348 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-svc\") pod \"d5517f09-737f-4a05-8293-d646d035c4ea\" (UID: \"d5517f09-737f-4a05-8293-d646d035c4ea\") "
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.064208 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5517f09-737f-4a05-8293-d646d035c4ea-kube-api-access-wplhc" (OuterVolumeSpecName: "kube-api-access-wplhc") pod "d5517f09-737f-4a05-8293-d646d035c4ea" (UID: "d5517f09-737f-4a05-8293-d646d035c4ea"). InnerVolumeSpecName "kube-api-access-wplhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.099007 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-config" (OuterVolumeSpecName: "config") pod "d5517f09-737f-4a05-8293-d646d035c4ea" (UID: "d5517f09-737f-4a05-8293-d646d035c4ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.099067 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5517f09-737f-4a05-8293-d646d035c4ea" (UID: "d5517f09-737f-4a05-8293-d646d035c4ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.103898 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d5517f09-737f-4a05-8293-d646d035c4ea" (UID: "d5517f09-737f-4a05-8293-d646d035c4ea"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.107637 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5517f09-737f-4a05-8293-d646d035c4ea" (UID: "d5517f09-737f-4a05-8293-d646d035c4ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.109135 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5517f09-737f-4a05-8293-d646d035c4ea" (UID: "d5517f09-737f-4a05-8293-d646d035c4ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.161338 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.161382 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.161394 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.161406 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-config\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.161417 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wplhc\" (UniqueName: \"kubernetes.io/projected/d5517f09-737f-4a05-8293-d646d035c4ea-kube-api-access-wplhc\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.161430 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5517f09-737f-4a05-8293-d646d035c4ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.877413 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-bktbk"
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.896962 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-bktbk"]
Mar 19 15:36:19 crc kubenswrapper[4771]: I0319 15:36:19.904849 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-bktbk"]
Mar 19 15:36:20 crc kubenswrapper[4771]: I0319 15:36:20.509077 4771 scope.go:117] "RemoveContainer" containerID="2d902a4936e82a188e94b40a59bd5bd8dcecd25c29a3d32b7128d5438cccb48e"
Mar 19 15:36:20 crc kubenswrapper[4771]: E0319 15:36:20.509608 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:36:21 crc kubenswrapper[4771]: I0319 15:36:21.520967 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5517f09-737f-4a05-8293-d646d035c4ea" path="/var/lib/kubelet/pods/d5517f09-737f-4a05-8293-d646d035c4ea/volumes"
Mar 19 15:36:22 crc kubenswrapper[4771]: I0319 15:36:22.509542 4771 scope.go:117] "RemoveContainer" containerID="6ae743143ba69d6b0b4a45930cf67ab154bf3857bc3a2b913bc411fef0b0bb62"
Mar 19 15:36:22 crc kubenswrapper[4771]: E0319 15:36:22.509869 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:36:23 crc kubenswrapper[4771]: I0319 15:36:23.027644 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:36:23 crc kubenswrapper[4771]: I0319 15:36:23.027709 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:36:23 crc kubenswrapper[4771]: I0319 15:36:23.027758 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp"
Mar 19 15:36:23 crc kubenswrapper[4771]: I0319 15:36:23.028429 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"953ae2c967341b87ceb085ed4fd2e6023f2aede65dc55a12a3a59811b0300199"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 15:36:23 crc kubenswrapper[4771]: I0319 15:36:23.028493 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://953ae2c967341b87ceb085ed4fd2e6023f2aede65dc55a12a3a59811b0300199" gracePeriod=600
Mar 19 15:36:23 crc kubenswrapper[4771]: I0319 15:36:23.912581 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="953ae2c967341b87ceb085ed4fd2e6023f2aede65dc55a12a3a59811b0300199" exitCode=0
Mar 19 15:36:23 crc kubenswrapper[4771]: I0319 15:36:23.912657 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"953ae2c967341b87ceb085ed4fd2e6023f2aede65dc55a12a3a59811b0300199"}
Mar 19 15:36:23 crc kubenswrapper[4771]: I0319 15:36:23.913290 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"9a7f9aaa7059458b50170d0fb5711aac81f8c848fea74597bd5d13bc3f12edd3"}
Mar 19 15:36:23 crc kubenswrapper[4771]: I0319 15:36:23.913316 4771 scope.go:117] "RemoveContainer" containerID="ed5a553c2d92c915ce47410116c0fc185162ea3ab77f14a7e2453e14985c8a40"
Mar 19 15:36:33 crc kubenswrapper[4771]: I0319 15:36:33.508686 4771 scope.go:117] "RemoveContainer" containerID="2d902a4936e82a188e94b40a59bd5bd8dcecd25c29a3d32b7128d5438cccb48e"
Mar 19 15:36:33 crc kubenswrapper[4771]: I0319 15:36:33.509461 4771 scope.go:117] "RemoveContainer" containerID="6ae743143ba69d6b0b4a45930cf67ab154bf3857bc3a2b913bc411fef0b0bb62"
Mar 19 15:36:34 crc kubenswrapper[4771]: I0319 15:36:34.009433 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"f71507522606fb47923b846f0559d8def5e0dc9cdbf6c487d296551b18be43f3"}
Mar 19 15:36:34 crc kubenswrapper[4771]: I0319 15:36:34.009964 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 15:36:34 crc kubenswrapper[4771]: I0319 15:36:34.012607 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"83cf59b2e554c8cb2babdff4544fcbbed7f828601df7093f61d45ae1551b4c15"}
Mar 19 15:36:34 crc kubenswrapper[4771]: I0319 15:36:34.012890 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 19 15:36:39 crc kubenswrapper[4771]: I0319 15:36:39.061558 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="f71507522606fb47923b846f0559d8def5e0dc9cdbf6c487d296551b18be43f3" exitCode=0
Mar 19 15:36:39 crc kubenswrapper[4771]: I0319 15:36:39.061653 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"f71507522606fb47923b846f0559d8def5e0dc9cdbf6c487d296551b18be43f3"}
Mar 19 15:36:39 crc kubenswrapper[4771]: I0319 15:36:39.062138 4771 scope.go:117] "RemoveContainer" containerID="2d902a4936e82a188e94b40a59bd5bd8dcecd25c29a3d32b7128d5438cccb48e"
Mar 19 15:36:39 crc kubenswrapper[4771]: I0319 15:36:39.062827 4771 scope.go:117] "RemoveContainer" containerID="f71507522606fb47923b846f0559d8def5e0dc9cdbf6c487d296551b18be43f3"
Mar 19 15:36:39 crc kubenswrapper[4771]: E0319 15:36:39.063287 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:36:40 crc kubenswrapper[4771]: I0319 15:36:40.075810 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="83cf59b2e554c8cb2babdff4544fcbbed7f828601df7093f61d45ae1551b4c15" exitCode=0
Mar 19 15:36:40 crc kubenswrapper[4771]: I0319 15:36:40.075877 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"83cf59b2e554c8cb2babdff4544fcbbed7f828601df7093f61d45ae1551b4c15"}
Mar 19 15:36:40 crc kubenswrapper[4771]: I0319 15:36:40.076223 4771
scope.go:117] "RemoveContainer" containerID="6ae743143ba69d6b0b4a45930cf67ab154bf3857bc3a2b913bc411fef0b0bb62" Mar 19 15:36:40 crc kubenswrapper[4771]: I0319 15:36:40.077254 4771 scope.go:117] "RemoveContainer" containerID="83cf59b2e554c8cb2babdff4544fcbbed7f828601df7093f61d45ae1551b4c15" Mar 19 15:36:40 crc kubenswrapper[4771]: E0319 15:36:40.077635 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:36:51 crc kubenswrapper[4771]: I0319 15:36:51.532737 4771 scope.go:117] "RemoveContainer" containerID="83cf59b2e554c8cb2babdff4544fcbbed7f828601df7093f61d45ae1551b4c15" Mar 19 15:36:51 crc kubenswrapper[4771]: E0319 15:36:51.534538 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:36:52 crc kubenswrapper[4771]: I0319 15:36:52.509600 4771 scope.go:117] "RemoveContainer" containerID="f71507522606fb47923b846f0559d8def5e0dc9cdbf6c487d296551b18be43f3" Mar 19 15:36:52 crc kubenswrapper[4771]: E0319 15:36:52.510472 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:36:57 crc kubenswrapper[4771]: I0319 15:36:57.001040 4771 scope.go:117] "RemoveContainer" 
containerID="de583627c27a36e50e86e60a2d6e637578d599acc73b71c3837fc8ca21e2d5f0" Mar 19 15:37:03 crc kubenswrapper[4771]: I0319 15:37:03.510587 4771 scope.go:117] "RemoveContainer" containerID="83cf59b2e554c8cb2babdff4544fcbbed7f828601df7093f61d45ae1551b4c15" Mar 19 15:37:03 crc kubenswrapper[4771]: E0319 15:37:03.511639 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:37:07 crc kubenswrapper[4771]: I0319 15:37:07.509011 4771 scope.go:117] "RemoveContainer" containerID="f71507522606fb47923b846f0559d8def5e0dc9cdbf6c487d296551b18be43f3" Mar 19 15:37:07 crc kubenswrapper[4771]: E0319 15:37:07.509920 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:37:17 crc kubenswrapper[4771]: I0319 15:37:17.509881 4771 scope.go:117] "RemoveContainer" containerID="83cf59b2e554c8cb2babdff4544fcbbed7f828601df7093f61d45ae1551b4c15" Mar 19 15:37:17 crc kubenswrapper[4771]: E0319 15:37:17.511460 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:37:21 crc kubenswrapper[4771]: I0319 15:37:21.518117 4771 scope.go:117] "RemoveContainer" 
containerID="f71507522606fb47923b846f0559d8def5e0dc9cdbf6c487d296551b18be43f3" Mar 19 15:37:22 crc kubenswrapper[4771]: I0319 15:37:22.499198 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f"} Mar 19 15:37:22 crc kubenswrapper[4771]: I0319 15:37:22.499929 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:37:26 crc kubenswrapper[4771]: I0319 15:37:26.543320 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" exitCode=0 Mar 19 15:37:26 crc kubenswrapper[4771]: I0319 15:37:26.543454 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f"} Mar 19 15:37:26 crc kubenswrapper[4771]: I0319 15:37:26.543905 4771 scope.go:117] "RemoveContainer" containerID="f71507522606fb47923b846f0559d8def5e0dc9cdbf6c487d296551b18be43f3" Mar 19 15:37:26 crc kubenswrapper[4771]: I0319 15:37:26.544868 4771 scope.go:117] "RemoveContainer" containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" Mar 19 15:37:26 crc kubenswrapper[4771]: E0319 15:37:26.545238 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:37:31 crc kubenswrapper[4771]: I0319 15:37:31.521953 4771 scope.go:117] "RemoveContainer" 
containerID="83cf59b2e554c8cb2babdff4544fcbbed7f828601df7093f61d45ae1551b4c15" Mar 19 15:37:32 crc kubenswrapper[4771]: I0319 15:37:32.607905 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac"} Mar 19 15:37:32 crc kubenswrapper[4771]: I0319 15:37:32.608634 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 15:37:36 crc kubenswrapper[4771]: I0319 15:37:36.645612 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac" exitCode=0 Mar 19 15:37:36 crc kubenswrapper[4771]: I0319 15:37:36.645693 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac"} Mar 19 15:37:36 crc kubenswrapper[4771]: I0319 15:37:36.646653 4771 scope.go:117] "RemoveContainer" containerID="83cf59b2e554c8cb2babdff4544fcbbed7f828601df7093f61d45ae1551b4c15" Mar 19 15:37:36 crc kubenswrapper[4771]: I0319 15:37:36.647406 4771 scope.go:117] "RemoveContainer" containerID="87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac" Mar 19 15:37:36 crc kubenswrapper[4771]: E0319 15:37:36.647681 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:37:37 crc kubenswrapper[4771]: I0319 15:37:37.509178 4771 scope.go:117] "RemoveContainer" 
containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" Mar 19 15:37:37 crc kubenswrapper[4771]: E0319 15:37:37.509403 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:37:51 crc kubenswrapper[4771]: I0319 15:37:51.515796 4771 scope.go:117] "RemoveContainer" containerID="87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac" Mar 19 15:37:51 crc kubenswrapper[4771]: E0319 15:37:51.516541 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:37:52 crc kubenswrapper[4771]: I0319 15:37:52.509325 4771 scope.go:117] "RemoveContainer" containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" Mar 19 15:37:52 crc kubenswrapper[4771]: E0319 15:37:52.509602 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.150353 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565578-bmw69"] Mar 19 15:38:00 crc kubenswrapper[4771]: E0319 15:38:00.151259 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d5517f09-737f-4a05-8293-d646d035c4ea" containerName="dnsmasq-dns" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.151272 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5517f09-737f-4a05-8293-d646d035c4ea" containerName="dnsmasq-dns" Mar 19 15:38:00 crc kubenswrapper[4771]: E0319 15:38:00.151298 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5517f09-737f-4a05-8293-d646d035c4ea" containerName="init" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.151304 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5517f09-737f-4a05-8293-d646d035c4ea" containerName="init" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.151435 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5517f09-737f-4a05-8293-d646d035c4ea" containerName="dnsmasq-dns" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.151967 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565578-bmw69" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.155418 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.155622 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.155756 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.159142 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565578-bmw69"] Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.246057 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxfjz\" (UniqueName: 
\"kubernetes.io/projected/ae9e370e-3985-4fb9-83fd-b080b1c22ca0-kube-api-access-pxfjz\") pod \"auto-csr-approver-29565578-bmw69\" (UID: \"ae9e370e-3985-4fb9-83fd-b080b1c22ca0\") " pod="openshift-infra/auto-csr-approver-29565578-bmw69" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.347714 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfjz\" (UniqueName: \"kubernetes.io/projected/ae9e370e-3985-4fb9-83fd-b080b1c22ca0-kube-api-access-pxfjz\") pod \"auto-csr-approver-29565578-bmw69\" (UID: \"ae9e370e-3985-4fb9-83fd-b080b1c22ca0\") " pod="openshift-infra/auto-csr-approver-29565578-bmw69" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.371780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxfjz\" (UniqueName: \"kubernetes.io/projected/ae9e370e-3985-4fb9-83fd-b080b1c22ca0-kube-api-access-pxfjz\") pod \"auto-csr-approver-29565578-bmw69\" (UID: \"ae9e370e-3985-4fb9-83fd-b080b1c22ca0\") " pod="openshift-infra/auto-csr-approver-29565578-bmw69" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.494252 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565578-bmw69" Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.787282 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565578-bmw69"] Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.798864 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 15:38:00 crc kubenswrapper[4771]: I0319 15:38:00.877865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565578-bmw69" event={"ID":"ae9e370e-3985-4fb9-83fd-b080b1c22ca0","Type":"ContainerStarted","Data":"d10672602d1b0cbe25b2a80fe696ff3b0b2168106230a0e285f27aec35d7ccaa"} Mar 19 15:38:02 crc kubenswrapper[4771]: I0319 15:38:02.899886 4771 generic.go:334] "Generic (PLEG): container finished" podID="ae9e370e-3985-4fb9-83fd-b080b1c22ca0" containerID="81c6bd785bc62d983b852e6db2010b3f9af9371c68c00e9c7ac742ac6d30bf2e" exitCode=0 Mar 19 15:38:02 crc kubenswrapper[4771]: I0319 15:38:02.900023 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565578-bmw69" event={"ID":"ae9e370e-3985-4fb9-83fd-b080b1c22ca0","Type":"ContainerDied","Data":"81c6bd785bc62d983b852e6db2010b3f9af9371c68c00e9c7ac742ac6d30bf2e"} Mar 19 15:38:04 crc kubenswrapper[4771]: I0319 15:38:04.266059 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565578-bmw69" Mar 19 15:38:04 crc kubenswrapper[4771]: I0319 15:38:04.417156 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxfjz\" (UniqueName: \"kubernetes.io/projected/ae9e370e-3985-4fb9-83fd-b080b1c22ca0-kube-api-access-pxfjz\") pod \"ae9e370e-3985-4fb9-83fd-b080b1c22ca0\" (UID: \"ae9e370e-3985-4fb9-83fd-b080b1c22ca0\") " Mar 19 15:38:04 crc kubenswrapper[4771]: I0319 15:38:04.426762 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9e370e-3985-4fb9-83fd-b080b1c22ca0-kube-api-access-pxfjz" (OuterVolumeSpecName: "kube-api-access-pxfjz") pod "ae9e370e-3985-4fb9-83fd-b080b1c22ca0" (UID: "ae9e370e-3985-4fb9-83fd-b080b1c22ca0"). InnerVolumeSpecName "kube-api-access-pxfjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:38:04 crc kubenswrapper[4771]: I0319 15:38:04.508698 4771 scope.go:117] "RemoveContainer" containerID="87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac" Mar 19 15:38:04 crc kubenswrapper[4771]: E0319 15:38:04.509092 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:38:04 crc kubenswrapper[4771]: I0319 15:38:04.519315 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxfjz\" (UniqueName: \"kubernetes.io/projected/ae9e370e-3985-4fb9-83fd-b080b1c22ca0-kube-api-access-pxfjz\") on node \"crc\" DevicePath \"\"" Mar 19 15:38:04 crc kubenswrapper[4771]: I0319 15:38:04.922317 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565578-bmw69" 
event={"ID":"ae9e370e-3985-4fb9-83fd-b080b1c22ca0","Type":"ContainerDied","Data":"d10672602d1b0cbe25b2a80fe696ff3b0b2168106230a0e285f27aec35d7ccaa"} Mar 19 15:38:04 crc kubenswrapper[4771]: I0319 15:38:04.922366 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d10672602d1b0cbe25b2a80fe696ff3b0b2168106230a0e285f27aec35d7ccaa" Mar 19 15:38:04 crc kubenswrapper[4771]: I0319 15:38:04.922371 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565578-bmw69" Mar 19 15:38:05 crc kubenswrapper[4771]: I0319 15:38:05.352412 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565572-h8tvp"] Mar 19 15:38:05 crc kubenswrapper[4771]: I0319 15:38:05.358441 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565572-h8tvp"] Mar 19 15:38:05 crc kubenswrapper[4771]: I0319 15:38:05.527095 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881b13b7-beaa-4807-a523-329fd35bb96d" path="/var/lib/kubelet/pods/881b13b7-beaa-4807-a523-329fd35bb96d/volumes" Mar 19 15:38:08 crc kubenswrapper[4771]: I0319 15:38:08.508734 4771 scope.go:117] "RemoveContainer" containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" Mar 19 15:38:08 crc kubenswrapper[4771]: E0319 15:38:08.509354 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:38:19 crc kubenswrapper[4771]: I0319 15:38:19.509350 4771 scope.go:117] "RemoveContainer" containerID="87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac" Mar 19 15:38:19 crc kubenswrapper[4771]: I0319 15:38:19.510074 
4771 scope.go:117] "RemoveContainer" containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" Mar 19 15:38:19 crc kubenswrapper[4771]: E0319 15:38:19.510371 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:38:19 crc kubenswrapper[4771]: E0319 15:38:19.510403 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:38:23 crc kubenswrapper[4771]: I0319 15:38:23.027402 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:38:23 crc kubenswrapper[4771]: I0319 15:38:23.027803 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:38:31 crc kubenswrapper[4771]: I0319 15:38:31.518070 4771 scope.go:117] "RemoveContainer" containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" Mar 19 15:38:31 crc kubenswrapper[4771]: E0319 15:38:31.519423 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:38:34 crc kubenswrapper[4771]: I0319 15:38:34.508723 4771 scope.go:117] "RemoveContainer" containerID="87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac" Mar 19 15:38:34 crc kubenswrapper[4771]: E0319 15:38:34.509597 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:38:45 crc kubenswrapper[4771]: I0319 15:38:45.508977 4771 scope.go:117] "RemoveContainer" containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" Mar 19 15:38:45 crc kubenswrapper[4771]: E0319 15:38:45.510235 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:38:46 crc kubenswrapper[4771]: I0319 15:38:46.510080 4771 scope.go:117] "RemoveContainer" containerID="87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac" Mar 19 15:38:46 crc kubenswrapper[4771]: E0319 15:38:46.510399 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" 
podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:38:53 crc kubenswrapper[4771]: I0319 15:38:53.028191 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:38:53 crc kubenswrapper[4771]: I0319 15:38:53.029373 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:38:57 crc kubenswrapper[4771]: I0319 15:38:57.106386 4771 scope.go:117] "RemoveContainer" containerID="56d67677ac0ab0f0d1788bbe3b70630faeb581ca3304718c54706eeed92d801d" Mar 19 15:39:00 crc kubenswrapper[4771]: I0319 15:39:00.509331 4771 scope.go:117] "RemoveContainer" containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" Mar 19 15:39:00 crc kubenswrapper[4771]: I0319 15:39:00.510132 4771 scope.go:117] "RemoveContainer" containerID="87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac" Mar 19 15:39:01 crc kubenswrapper[4771]: I0319 15:39:01.530611 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3"} Mar 19 15:39:01 crc kubenswrapper[4771]: I0319 15:39:01.531109 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19"} Mar 19 15:39:01 crc kubenswrapper[4771]: I0319 
15:39:01.531607 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:39:01 crc kubenswrapper[4771]: I0319 15:39:01.531643 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 15:39:05 crc kubenswrapper[4771]: I0319 15:39:05.550519 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" exitCode=0 Mar 19 15:39:05 crc kubenswrapper[4771]: I0319 15:39:05.550631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19"} Mar 19 15:39:05 crc kubenswrapper[4771]: I0319 15:39:05.551110 4771 scope.go:117] "RemoveContainer" containerID="b403cf2717f36e8851470f6fdbcc1e19ef4e3cc7642df5bd3bafb3e85709d78f" Mar 19 15:39:05 crc kubenswrapper[4771]: I0319 15:39:05.552171 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:39:05 crc kubenswrapper[4771]: E0319 15:39:05.552607 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:39:05 crc kubenswrapper[4771]: I0319 15:39:05.556335 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" exitCode=0 Mar 19 15:39:05 crc kubenswrapper[4771]: I0319 15:39:05.556376 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3"} Mar 19 15:39:05 crc kubenswrapper[4771]: I0319 15:39:05.556951 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:39:05 crc kubenswrapper[4771]: E0319 15:39:05.557307 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:39:05 crc kubenswrapper[4771]: I0319 15:39:05.617600 4771 scope.go:117] "RemoveContainer" containerID="87717d1d986a54696c6b6e88984501af1f290779afabd0f4ba366c1458b946ac" Mar 19 15:39:16 crc kubenswrapper[4771]: I0319 15:39:16.509134 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:39:16 crc kubenswrapper[4771]: E0319 15:39:16.509955 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:39:17 crc kubenswrapper[4771]: I0319 15:39:17.509461 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:39:17 crc kubenswrapper[4771]: E0319 15:39:17.509950 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:39:23 crc kubenswrapper[4771]: I0319 15:39:23.027737 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:39:23 crc kubenswrapper[4771]: I0319 15:39:23.028167 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:39:23 crc kubenswrapper[4771]: I0319 15:39:23.028225 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:39:23 crc kubenswrapper[4771]: I0319 15:39:23.029128 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a7f9aaa7059458b50170d0fb5711aac81f8c848fea74597bd5d13bc3f12edd3"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 15:39:23 crc kubenswrapper[4771]: I0319 15:39:23.029210 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://9a7f9aaa7059458b50170d0fb5711aac81f8c848fea74597bd5d13bc3f12edd3" gracePeriod=600 Mar 19 15:39:23 crc kubenswrapper[4771]: I0319 
15:39:23.740593 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="9a7f9aaa7059458b50170d0fb5711aac81f8c848fea74597bd5d13bc3f12edd3" exitCode=0 Mar 19 15:39:23 crc kubenswrapper[4771]: I0319 15:39:23.740681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"9a7f9aaa7059458b50170d0fb5711aac81f8c848fea74597bd5d13bc3f12edd3"} Mar 19 15:39:23 crc kubenswrapper[4771]: I0319 15:39:23.741111 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"} Mar 19 15:39:23 crc kubenswrapper[4771]: I0319 15:39:23.741162 4771 scope.go:117] "RemoveContainer" containerID="953ae2c967341b87ceb085ed4fd2e6023f2aede65dc55a12a3a59811b0300199" Mar 19 15:39:27 crc kubenswrapper[4771]: I0319 15:39:27.508691 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:39:27 crc kubenswrapper[4771]: E0319 15:39:27.509265 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:39:29 crc kubenswrapper[4771]: I0319 15:39:29.508435 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:39:29 crc kubenswrapper[4771]: E0319 15:39:29.508981 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with 
CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:39:42 crc kubenswrapper[4771]: I0319 15:39:42.509214 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:39:42 crc kubenswrapper[4771]: E0319 15:39:42.510348 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:39:43 crc kubenswrapper[4771]: I0319 15:39:43.508874 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:39:43 crc kubenswrapper[4771]: E0319 15:39:43.509508 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:39:54 crc kubenswrapper[4771]: I0319 15:39:54.508439 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:39:54 crc kubenswrapper[4771]: E0319 15:39:54.509407 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 
15:39:56 crc kubenswrapper[4771]: I0319 15:39:56.509591 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:39:56 crc kubenswrapper[4771]: E0319 15:39:56.510130 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.137419 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565580-9kqg9"] Mar 19 15:40:00 crc kubenswrapper[4771]: E0319 15:40:00.138138 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9e370e-3985-4fb9-83fd-b080b1c22ca0" containerName="oc" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.138156 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9e370e-3985-4fb9-83fd-b080b1c22ca0" containerName="oc" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.138386 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9e370e-3985-4fb9-83fd-b080b1c22ca0" containerName="oc" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.139035 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565580-9kqg9" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.142664 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.143103 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.144963 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.158177 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565580-9kqg9"] Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.242502 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcpt4\" (UniqueName: \"kubernetes.io/projected/798cb9bf-2c53-499e-b648-9f9733743e23-kube-api-access-lcpt4\") pod \"auto-csr-approver-29565580-9kqg9\" (UID: \"798cb9bf-2c53-499e-b648-9f9733743e23\") " pod="openshift-infra/auto-csr-approver-29565580-9kqg9" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.343834 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcpt4\" (UniqueName: \"kubernetes.io/projected/798cb9bf-2c53-499e-b648-9f9733743e23-kube-api-access-lcpt4\") pod \"auto-csr-approver-29565580-9kqg9\" (UID: \"798cb9bf-2c53-499e-b648-9f9733743e23\") " pod="openshift-infra/auto-csr-approver-29565580-9kqg9" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.365153 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcpt4\" (UniqueName: \"kubernetes.io/projected/798cb9bf-2c53-499e-b648-9f9733743e23-kube-api-access-lcpt4\") pod \"auto-csr-approver-29565580-9kqg9\" (UID: \"798cb9bf-2c53-499e-b648-9f9733743e23\") " 
pod="openshift-infra/auto-csr-approver-29565580-9kqg9" Mar 19 15:40:00 crc kubenswrapper[4771]: I0319 15:40:00.539314 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565580-9kqg9" Mar 19 15:40:01 crc kubenswrapper[4771]: I0319 15:40:01.036458 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565580-9kqg9"] Mar 19 15:40:01 crc kubenswrapper[4771]: I0319 15:40:01.064259 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565580-9kqg9" event={"ID":"798cb9bf-2c53-499e-b648-9f9733743e23","Type":"ContainerStarted","Data":"5741e25630a1070e4f72718541ebad0178d0cab45617745cab24f0658de03a56"} Mar 19 15:40:03 crc kubenswrapper[4771]: I0319 15:40:03.082959 4771 generic.go:334] "Generic (PLEG): container finished" podID="798cb9bf-2c53-499e-b648-9f9733743e23" containerID="c98219a7ea9b7344e3da87cb392dfdcab8233d431a7d9c9e5dec2e007ecbebad" exitCode=0 Mar 19 15:40:03 crc kubenswrapper[4771]: I0319 15:40:03.083023 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565580-9kqg9" event={"ID":"798cb9bf-2c53-499e-b648-9f9733743e23","Type":"ContainerDied","Data":"c98219a7ea9b7344e3da87cb392dfdcab8233d431a7d9c9e5dec2e007ecbebad"} Mar 19 15:40:04 crc kubenswrapper[4771]: I0319 15:40:04.395778 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565580-9kqg9" Mar 19 15:40:04 crc kubenswrapper[4771]: I0319 15:40:04.512522 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcpt4\" (UniqueName: \"kubernetes.io/projected/798cb9bf-2c53-499e-b648-9f9733743e23-kube-api-access-lcpt4\") pod \"798cb9bf-2c53-499e-b648-9f9733743e23\" (UID: \"798cb9bf-2c53-499e-b648-9f9733743e23\") " Mar 19 15:40:04 crc kubenswrapper[4771]: I0319 15:40:04.519232 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798cb9bf-2c53-499e-b648-9f9733743e23-kube-api-access-lcpt4" (OuterVolumeSpecName: "kube-api-access-lcpt4") pod "798cb9bf-2c53-499e-b648-9f9733743e23" (UID: "798cb9bf-2c53-499e-b648-9f9733743e23"). InnerVolumeSpecName "kube-api-access-lcpt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:40:04 crc kubenswrapper[4771]: I0319 15:40:04.615137 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcpt4\" (UniqueName: \"kubernetes.io/projected/798cb9bf-2c53-499e-b648-9f9733743e23-kube-api-access-lcpt4\") on node \"crc\" DevicePath \"\"" Mar 19 15:40:05 crc kubenswrapper[4771]: I0319 15:40:05.107140 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565580-9kqg9" event={"ID":"798cb9bf-2c53-499e-b648-9f9733743e23","Type":"ContainerDied","Data":"5741e25630a1070e4f72718541ebad0178d0cab45617745cab24f0658de03a56"} Mar 19 15:40:05 crc kubenswrapper[4771]: I0319 15:40:05.107408 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5741e25630a1070e4f72718541ebad0178d0cab45617745cab24f0658de03a56" Mar 19 15:40:05 crc kubenswrapper[4771]: I0319 15:40:05.107243 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565580-9kqg9" Mar 19 15:40:05 crc kubenswrapper[4771]: I0319 15:40:05.493056 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565574-p6k29"] Mar 19 15:40:05 crc kubenswrapper[4771]: I0319 15:40:05.505851 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565574-p6k29"] Mar 19 15:40:05 crc kubenswrapper[4771]: I0319 15:40:05.520624 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7bd159-532b-4d06-9079-4a04e5a1c3a4" path="/var/lib/kubelet/pods/ed7bd159-532b-4d06-9079-4a04e5a1c3a4/volumes" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.509044 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:40:09 crc kubenswrapper[4771]: E0319 15:40:09.509608 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.526116 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pctmt"] Mar 19 15:40:09 crc kubenswrapper[4771]: E0319 15:40:09.526578 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798cb9bf-2c53-499e-b648-9f9733743e23" containerName="oc" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.526603 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="798cb9bf-2c53-499e-b648-9f9733743e23" containerName="oc" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.526874 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="798cb9bf-2c53-499e-b648-9f9733743e23" containerName="oc" Mar 19 
15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.528915 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.533981 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pctmt"] Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.601524 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlgj\" (UniqueName: \"kubernetes.io/projected/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-kube-api-access-wmlgj\") pod \"community-operators-pctmt\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.601591 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-catalog-content\") pod \"community-operators-pctmt\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.601796 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-utilities\") pod \"community-operators-pctmt\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.703021 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-catalog-content\") pod \"community-operators-pctmt\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " 
pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.703274 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-utilities\") pod \"community-operators-pctmt\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.703341 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlgj\" (UniqueName: \"kubernetes.io/projected/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-kube-api-access-wmlgj\") pod \"community-operators-pctmt\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.703668 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-catalog-content\") pod \"community-operators-pctmt\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.703895 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-utilities\") pod \"community-operators-pctmt\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.723719 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlgj\" (UniqueName: \"kubernetes.io/projected/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-kube-api-access-wmlgj\") pod \"community-operators-pctmt\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " 
pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:09 crc kubenswrapper[4771]: I0319 15:40:09.860812 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:10 crc kubenswrapper[4771]: I0319 15:40:10.380298 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pctmt"] Mar 19 15:40:11 crc kubenswrapper[4771]: I0319 15:40:11.177075 4771 generic.go:334] "Generic (PLEG): container finished" podID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" containerID="fc32300101a50612f07c47bcf3c163601d30edcf56e57b2347271a05e33668bb" exitCode=0 Mar 19 15:40:11 crc kubenswrapper[4771]: I0319 15:40:11.177130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pctmt" event={"ID":"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976","Type":"ContainerDied","Data":"fc32300101a50612f07c47bcf3c163601d30edcf56e57b2347271a05e33668bb"} Mar 19 15:40:11 crc kubenswrapper[4771]: I0319 15:40:11.177437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pctmt" event={"ID":"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976","Type":"ContainerStarted","Data":"6f71d5d5510b595692b01bb597b474208b36a8f51c52f327eda441aa6723e6d5"} Mar 19 15:40:11 crc kubenswrapper[4771]: I0319 15:40:11.516184 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:40:11 crc kubenswrapper[4771]: E0319 15:40:11.516898 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:40:12 crc kubenswrapper[4771]: I0319 15:40:12.190066 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pctmt" event={"ID":"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976","Type":"ContainerStarted","Data":"b7dcf638c6b6e9e73f7276ec233e1468bb5630c1056d1557ff66d29709a9f89c"} Mar 19 15:40:13 crc kubenswrapper[4771]: I0319 15:40:13.198661 4771 generic.go:334] "Generic (PLEG): container finished" podID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" containerID="b7dcf638c6b6e9e73f7276ec233e1468bb5630c1056d1557ff66d29709a9f89c" exitCode=0 Mar 19 15:40:13 crc kubenswrapper[4771]: I0319 15:40:13.198781 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pctmt" event={"ID":"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976","Type":"ContainerDied","Data":"b7dcf638c6b6e9e73f7276ec233e1468bb5630c1056d1557ff66d29709a9f89c"} Mar 19 15:40:14 crc kubenswrapper[4771]: I0319 15:40:14.214928 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pctmt" event={"ID":"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976","Type":"ContainerStarted","Data":"cb8c2cfb441a0b43f283f1aad2755a5e9e9abf88b29a462dcac88ffa48ee2c3b"} Mar 19 15:40:14 crc kubenswrapper[4771]: I0319 15:40:14.246439 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pctmt" podStartSLOduration=2.8324305279999997 podStartE2EDuration="5.246408746s" podCreationTimestamp="2026-03-19 15:40:09 +0000 UTC" firstStartedPulling="2026-03-19 15:40:11.179135454 +0000 UTC m=+1470.407756656" lastFinishedPulling="2026-03-19 15:40:13.593113652 +0000 UTC m=+1472.821734874" observedRunningTime="2026-03-19 15:40:14.243537468 +0000 UTC m=+1473.472158720" watchObservedRunningTime="2026-03-19 15:40:14.246408746 +0000 UTC m=+1473.475029988" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.713365 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v4ppz"] Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 
15:40:15.716486 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.727345 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v4ppz"] Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.817288 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-utilities\") pod \"redhat-operators-v4ppz\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.817548 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85xml\" (UniqueName: \"kubernetes.io/projected/e4fd828f-9d29-435f-8950-3bc663193824-kube-api-access-85xml\") pod \"redhat-operators-v4ppz\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.817777 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-catalog-content\") pod \"redhat-operators-v4ppz\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.920072 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-utilities\") pod \"redhat-operators-v4ppz\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.920311 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85xml\" (UniqueName: \"kubernetes.io/projected/e4fd828f-9d29-435f-8950-3bc663193824-kube-api-access-85xml\") pod \"redhat-operators-v4ppz\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.920425 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-catalog-content\") pod \"redhat-operators-v4ppz\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.920698 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-utilities\") pod \"redhat-operators-v4ppz\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.920956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-catalog-content\") pod \"redhat-operators-v4ppz\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:15 crc kubenswrapper[4771]: I0319 15:40:15.958013 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85xml\" (UniqueName: \"kubernetes.io/projected/e4fd828f-9d29-435f-8950-3bc663193824-kube-api-access-85xml\") pod \"redhat-operators-v4ppz\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:16 crc kubenswrapper[4771]: I0319 15:40:16.042522 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:16 crc kubenswrapper[4771]: I0319 15:40:16.518613 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v4ppz"] Mar 19 15:40:16 crc kubenswrapper[4771]: W0319 15:40:16.519940 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4fd828f_9d29_435f_8950_3bc663193824.slice/crio-553aa691198233db2c58803b49024174f7a0b6594f254eb88e0242901ac59f36 WatchSource:0}: Error finding container 553aa691198233db2c58803b49024174f7a0b6594f254eb88e0242901ac59f36: Status 404 returned error can't find the container with id 553aa691198233db2c58803b49024174f7a0b6594f254eb88e0242901ac59f36 Mar 19 15:40:17 crc kubenswrapper[4771]: I0319 15:40:17.239820 4771 generic.go:334] "Generic (PLEG): container finished" podID="e4fd828f-9d29-435f-8950-3bc663193824" containerID="d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1" exitCode=0 Mar 19 15:40:17 crc kubenswrapper[4771]: I0319 15:40:17.239934 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4ppz" event={"ID":"e4fd828f-9d29-435f-8950-3bc663193824","Type":"ContainerDied","Data":"d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1"} Mar 19 15:40:17 crc kubenswrapper[4771]: I0319 15:40:17.240211 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4ppz" event={"ID":"e4fd828f-9d29-435f-8950-3bc663193824","Type":"ContainerStarted","Data":"553aa691198233db2c58803b49024174f7a0b6594f254eb88e0242901ac59f36"} Mar 19 15:40:19 crc kubenswrapper[4771]: I0319 15:40:19.861422 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:19 crc kubenswrapper[4771]: I0319 15:40:19.862228 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:19 crc kubenswrapper[4771]: I0319 15:40:19.912571 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:20 crc kubenswrapper[4771]: I0319 15:40:20.335820 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:21 crc kubenswrapper[4771]: I0319 15:40:21.276678 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4ppz" event={"ID":"e4fd828f-9d29-435f-8950-3bc663193824","Type":"ContainerStarted","Data":"6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61"} Mar 19 15:40:23 crc kubenswrapper[4771]: I0319 15:40:23.104348 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pctmt"] Mar 19 15:40:23 crc kubenswrapper[4771]: E0319 15:40:23.279743 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4fd828f_9d29_435f_8950_3bc663193824.slice/crio-conmon-6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61.scope\": RecentStats: unable to find data in memory cache]" Mar 19 15:40:23 crc kubenswrapper[4771]: I0319 15:40:23.301655 4771 generic.go:334] "Generic (PLEG): container finished" podID="e4fd828f-9d29-435f-8950-3bc663193824" containerID="6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61" exitCode=0 Mar 19 15:40:23 crc kubenswrapper[4771]: I0319 15:40:23.301701 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4ppz" event={"ID":"e4fd828f-9d29-435f-8950-3bc663193824","Type":"ContainerDied","Data":"6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61"} Mar 19 15:40:23 crc kubenswrapper[4771]: I0319 15:40:23.301828 
4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pctmt" podUID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" containerName="registry-server" containerID="cri-o://cb8c2cfb441a0b43f283f1aad2755a5e9e9abf88b29a462dcac88ffa48ee2c3b" gracePeriod=2 Mar 19 15:40:24 crc kubenswrapper[4771]: I0319 15:40:24.509189 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:40:24 crc kubenswrapper[4771]: E0319 15:40:24.509421 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:40:24 crc kubenswrapper[4771]: I0319 15:40:24.509945 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:40:24 crc kubenswrapper[4771]: E0319 15:40:24.510184 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.323712 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4ppz" event={"ID":"e4fd828f-9d29-435f-8950-3bc663193824","Type":"ContainerStarted","Data":"81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341"} Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.326385 4771 generic.go:334] "Generic (PLEG): container finished" podID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" 
containerID="cb8c2cfb441a0b43f283f1aad2755a5e9e9abf88b29a462dcac88ffa48ee2c3b" exitCode=0 Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.326419 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pctmt" event={"ID":"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976","Type":"ContainerDied","Data":"cb8c2cfb441a0b43f283f1aad2755a5e9e9abf88b29a462dcac88ffa48ee2c3b"} Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.357007 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v4ppz" podStartSLOduration=2.935647477 podStartE2EDuration="10.356964647s" podCreationTimestamp="2026-03-19 15:40:15 +0000 UTC" firstStartedPulling="2026-03-19 15:40:17.241447649 +0000 UTC m=+1476.470068871" lastFinishedPulling="2026-03-19 15:40:24.662764829 +0000 UTC m=+1483.891386041" observedRunningTime="2026-03-19 15:40:25.354294733 +0000 UTC m=+1484.582915935" watchObservedRunningTime="2026-03-19 15:40:25.356964647 +0000 UTC m=+1484.585585849" Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.544561 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.729095 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmlgj\" (UniqueName: \"kubernetes.io/projected/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-kube-api-access-wmlgj\") pod \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.729189 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-utilities\") pod \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.729307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-catalog-content\") pod \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\" (UID: \"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976\") " Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.734182 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-utilities" (OuterVolumeSpecName: "utilities") pod "7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" (UID: "7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.739358 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-kube-api-access-wmlgj" (OuterVolumeSpecName: "kube-api-access-wmlgj") pod "7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" (UID: "7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976"). InnerVolumeSpecName "kube-api-access-wmlgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.797370 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" (UID: "7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.831501 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmlgj\" (UniqueName: \"kubernetes.io/projected/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-kube-api-access-wmlgj\") on node \"crc\" DevicePath \"\"" Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.831552 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:40:25 crc kubenswrapper[4771]: I0319 15:40:25.831599 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:40:26 crc kubenswrapper[4771]: I0319 15:40:26.045298 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:26 crc kubenswrapper[4771]: I0319 15:40:26.045388 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:26 crc kubenswrapper[4771]: I0319 15:40:26.339364 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pctmt" Mar 19 15:40:26 crc kubenswrapper[4771]: I0319 15:40:26.339387 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pctmt" event={"ID":"7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976","Type":"ContainerDied","Data":"6f71d5d5510b595692b01bb597b474208b36a8f51c52f327eda441aa6723e6d5"} Mar 19 15:40:26 crc kubenswrapper[4771]: I0319 15:40:26.339526 4771 scope.go:117] "RemoveContainer" containerID="cb8c2cfb441a0b43f283f1aad2755a5e9e9abf88b29a462dcac88ffa48ee2c3b" Mar 19 15:40:26 crc kubenswrapper[4771]: I0319 15:40:26.359206 4771 scope.go:117] "RemoveContainer" containerID="b7dcf638c6b6e9e73f7276ec233e1468bb5630c1056d1557ff66d29709a9f89c" Mar 19 15:40:26 crc kubenswrapper[4771]: I0319 15:40:26.378414 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pctmt"] Mar 19 15:40:26 crc kubenswrapper[4771]: I0319 15:40:26.385397 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pctmt"] Mar 19 15:40:26 crc kubenswrapper[4771]: I0319 15:40:26.388055 4771 scope.go:117] "RemoveContainer" containerID="fc32300101a50612f07c47bcf3c163601d30edcf56e57b2347271a05e33668bb" Mar 19 15:40:27 crc kubenswrapper[4771]: I0319 15:40:27.126963 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v4ppz" podUID="e4fd828f-9d29-435f-8950-3bc663193824" containerName="registry-server" probeResult="failure" output=< Mar 19 15:40:27 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Mar 19 15:40:27 crc kubenswrapper[4771]: > Mar 19 15:40:27 crc kubenswrapper[4771]: I0319 15:40:27.521928 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" path="/var/lib/kubelet/pods/7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976/volumes" Mar 19 15:40:36 crc kubenswrapper[4771]: I0319 
15:40:36.097713 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:36 crc kubenswrapper[4771]: I0319 15:40:36.170003 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:36 crc kubenswrapper[4771]: I0319 15:40:36.344658 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v4ppz"] Mar 19 15:40:36 crc kubenswrapper[4771]: I0319 15:40:36.508627 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:40:36 crc kubenswrapper[4771]: E0319 15:40:36.509110 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:40:37 crc kubenswrapper[4771]: I0319 15:40:37.473131 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v4ppz" podUID="e4fd828f-9d29-435f-8950-3bc663193824" containerName="registry-server" containerID="cri-o://81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341" gracePeriod=2 Mar 19 15:40:37 crc kubenswrapper[4771]: I0319 15:40:37.508816 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:40:37 crc kubenswrapper[4771]: E0319 15:40:37.509454 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" 
podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:40:37 crc kubenswrapper[4771]: I0319 15:40:37.952589 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.057755 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-catalog-content\") pod \"e4fd828f-9d29-435f-8950-3bc663193824\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.057832 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-utilities\") pod \"e4fd828f-9d29-435f-8950-3bc663193824\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.057871 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85xml\" (UniqueName: \"kubernetes.io/projected/e4fd828f-9d29-435f-8950-3bc663193824-kube-api-access-85xml\") pod \"e4fd828f-9d29-435f-8950-3bc663193824\" (UID: \"e4fd828f-9d29-435f-8950-3bc663193824\") " Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.058869 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-utilities" (OuterVolumeSpecName: "utilities") pod "e4fd828f-9d29-435f-8950-3bc663193824" (UID: "e4fd828f-9d29-435f-8950-3bc663193824"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.059267 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.064389 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4fd828f-9d29-435f-8950-3bc663193824-kube-api-access-85xml" (OuterVolumeSpecName: "kube-api-access-85xml") pod "e4fd828f-9d29-435f-8950-3bc663193824" (UID: "e4fd828f-9d29-435f-8950-3bc663193824"). InnerVolumeSpecName "kube-api-access-85xml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.161288 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85xml\" (UniqueName: \"kubernetes.io/projected/e4fd828f-9d29-435f-8950-3bc663193824-kube-api-access-85xml\") on node \"crc\" DevicePath \"\"" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.218013 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4fd828f-9d29-435f-8950-3bc663193824" (UID: "e4fd828f-9d29-435f-8950-3bc663193824"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.263500 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd828f-9d29-435f-8950-3bc663193824-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.483679 4771 generic.go:334] "Generic (PLEG): container finished" podID="e4fd828f-9d29-435f-8950-3bc663193824" containerID="81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341" exitCode=0 Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.483728 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4ppz" event={"ID":"e4fd828f-9d29-435f-8950-3bc663193824","Type":"ContainerDied","Data":"81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341"} Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.483759 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4ppz" event={"ID":"e4fd828f-9d29-435f-8950-3bc663193824","Type":"ContainerDied","Data":"553aa691198233db2c58803b49024174f7a0b6594f254eb88e0242901ac59f36"} Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.483781 4771 scope.go:117] "RemoveContainer" containerID="81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.483811 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v4ppz" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.504661 4771 scope.go:117] "RemoveContainer" containerID="6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.514238 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v4ppz"] Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.521547 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v4ppz"] Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.540330 4771 scope.go:117] "RemoveContainer" containerID="d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.566322 4771 scope.go:117] "RemoveContainer" containerID="81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341" Mar 19 15:40:38 crc kubenswrapper[4771]: E0319 15:40:38.566966 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341\": container with ID starting with 81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341 not found: ID does not exist" containerID="81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.567045 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341"} err="failed to get container status \"81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341\": rpc error: code = NotFound desc = could not find container \"81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341\": container with ID starting with 81e5da4fffe38685455fdafe26e8f65f4ca2f0c393f84b0041c22adda5289341 not found: ID does 
not exist" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.567079 4771 scope.go:117] "RemoveContainer" containerID="6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61" Mar 19 15:40:38 crc kubenswrapper[4771]: E0319 15:40:38.567533 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61\": container with ID starting with 6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61 not found: ID does not exist" containerID="6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.567578 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61"} err="failed to get container status \"6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61\": rpc error: code = NotFound desc = could not find container \"6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61\": container with ID starting with 6be2a209cbc22e3c6977e8734bfa1ccace49d5bed6d1e93cfaaa772da9b49a61 not found: ID does not exist" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.567595 4771 scope.go:117] "RemoveContainer" containerID="d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1" Mar 19 15:40:38 crc kubenswrapper[4771]: E0319 15:40:38.567891 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1\": container with ID starting with d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1 not found: ID does not exist" containerID="d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1" Mar 19 15:40:38 crc kubenswrapper[4771]: I0319 15:40:38.567951 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1"} err="failed to get container status \"d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1\": rpc error: code = NotFound desc = could not find container \"d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1\": container with ID starting with d1cbb6ed7d4c08a834d64be6e64cdb172d37929a6e376aa3e503addb403fd0a1 not found: ID does not exist" Mar 19 15:40:39 crc kubenswrapper[4771]: I0319 15:40:39.517777 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4fd828f-9d29-435f-8950-3bc663193824" path="/var/lib/kubelet/pods/e4fd828f-9d29-435f-8950-3bc663193824/volumes" Mar 19 15:40:48 crc kubenswrapper[4771]: I0319 15:40:48.509637 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:40:48 crc kubenswrapper[4771]: E0319 15:40:48.510774 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:40:52 crc kubenswrapper[4771]: I0319 15:40:52.509222 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:40:52 crc kubenswrapper[4771]: E0319 15:40:52.510004 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:40:57 crc kubenswrapper[4771]: I0319 15:40:57.235600 4771 scope.go:117] 
"RemoveContainer" containerID="48d42026c3f1b277a357c562ccc6caf5fa451542a3c29a2b3e52d367b9067165" Mar 19 15:41:00 crc kubenswrapper[4771]: I0319 15:41:00.509308 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:41:00 crc kubenswrapper[4771]: E0319 15:41:00.510299 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:41:07 crc kubenswrapper[4771]: I0319 15:41:07.509361 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:41:07 crc kubenswrapper[4771]: E0319 15:41:07.510671 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:41:13 crc kubenswrapper[4771]: I0319 15:41:13.509029 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:41:13 crc kubenswrapper[4771]: E0319 15:41:13.509918 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:41:18 crc kubenswrapper[4771]: I0319 15:41:18.509120 4771 scope.go:117] "RemoveContainer" 
containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:41:18 crc kubenswrapper[4771]: E0319 15:41:18.510017 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:41:23 crc kubenswrapper[4771]: I0319 15:41:23.027979 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:41:23 crc kubenswrapper[4771]: I0319 15:41:23.028584 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:41:24 crc kubenswrapper[4771]: I0319 15:41:24.508388 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:41:24 crc kubenswrapper[4771]: E0319 15:41:24.508733 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:41:29 crc kubenswrapper[4771]: I0319 15:41:29.509087 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 
19 15:41:29 crc kubenswrapper[4771]: E0319 15:41:29.509751 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:41:35 crc kubenswrapper[4771]: I0319 15:41:35.509434 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:41:35 crc kubenswrapper[4771]: E0319 15:41:35.510199 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:41:40 crc kubenswrapper[4771]: I0319 15:41:40.509322 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:41:40 crc kubenswrapper[4771]: E0319 15:41:40.509801 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:41:50 crc kubenswrapper[4771]: I0319 15:41:50.510309 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:41:51 crc kubenswrapper[4771]: I0319 15:41:51.153766 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"} Mar 19 15:41:51 crc kubenswrapper[4771]: I0319 15:41:51.154310 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 15:41:53 crc kubenswrapper[4771]: I0319 15:41:53.026944 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:41:53 crc kubenswrapper[4771]: I0319 15:41:53.027297 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:41:54 crc kubenswrapper[4771]: I0319 15:41:54.508779 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:41:55 crc kubenswrapper[4771]: I0319 15:41:55.186705 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"} Mar 19 15:41:55 crc kubenswrapper[4771]: I0319 15:41:55.187679 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 15:41:55 crc kubenswrapper[4771]: I0319 15:41:55.189958 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" exitCode=0 Mar 19 15:41:55 
crc kubenswrapper[4771]: I0319 15:41:55.190023 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"} Mar 19 15:41:55 crc kubenswrapper[4771]: I0319 15:41:55.190198 4771 scope.go:117] "RemoveContainer" containerID="944842aae1f1ed0bfdac88cfde25d0fea353437682bb62df1fc04652978c03b3" Mar 19 15:41:55 crc kubenswrapper[4771]: I0319 15:41:55.191047 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:41:55 crc kubenswrapper[4771]: E0319 15:41:55.191568 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:41:57 crc kubenswrapper[4771]: I0319 15:41:57.339711 4771 scope.go:117] "RemoveContainer" containerID="f19f47f7621886863d38d7f0bcdea4d8605c7b62ed90a4160cbba9353ac304a2" Mar 19 15:41:57 crc kubenswrapper[4771]: I0319 15:41:57.366216 4771 scope.go:117] "RemoveContainer" containerID="dfee18827bb3d9a90d2cbe70a5ad81a401b0cfbc305a2adacf8eccca2a8c2acf" Mar 19 15:41:57 crc kubenswrapper[4771]: I0319 15:41:57.422114 4771 scope.go:117] "RemoveContainer" containerID="2a93cacee0d6856fe3c176fc688b030feb95f2cc7ac9b364893de25c5997cc63" Mar 19 15:41:57 crc kubenswrapper[4771]: I0319 15:41:57.471562 4771 scope.go:117] "RemoveContainer" containerID="1c02f0a334d1c859ae293acb2f35e7f9444215760af8f44efe70337cb9a936dc" Mar 19 15:41:57 crc kubenswrapper[4771]: I0319 15:41:57.499217 4771 scope.go:117] "RemoveContainer" containerID="74db2422ce5d601da90ad2006b630bc77879d8f86b62fd4c7b4adc48c3de604d" Mar 19 15:41:57 crc kubenswrapper[4771]: I0319 15:41:57.552033 
4771 scope.go:117] "RemoveContainer" containerID="f7d38bd03d7a2891b81764d6f808df090460386a76acc1db6fd8eb781b49f624" Mar 19 15:41:59 crc kubenswrapper[4771]: I0319 15:41:59.235102 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" exitCode=0 Mar 19 15:41:59 crc kubenswrapper[4771]: I0319 15:41:59.235253 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"} Mar 19 15:41:59 crc kubenswrapper[4771]: I0319 15:41:59.235582 4771 scope.go:117] "RemoveContainer" containerID="36d524adff342ba72b9364ccb05183dddad5974cccba9a6d3b848393efb41b19" Mar 19 15:41:59 crc kubenswrapper[4771]: I0319 15:41:59.236760 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:41:59 crc kubenswrapper[4771]: E0319 15:41:59.237315 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.159065 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565582-x76kx"] Mar 19 15:42:00 crc kubenswrapper[4771]: E0319 15:42:00.159637 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" containerName="registry-server" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.159663 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" 
containerName="registry-server" Mar 19 15:42:00 crc kubenswrapper[4771]: E0319 15:42:00.159694 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fd828f-9d29-435f-8950-3bc663193824" containerName="extract-content" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.159705 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fd828f-9d29-435f-8950-3bc663193824" containerName="extract-content" Mar 19 15:42:00 crc kubenswrapper[4771]: E0319 15:42:00.159723 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" containerName="extract-utilities" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.159736 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" containerName="extract-utilities" Mar 19 15:42:00 crc kubenswrapper[4771]: E0319 15:42:00.159754 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" containerName="extract-content" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.159764 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" containerName="extract-content" Mar 19 15:42:00 crc kubenswrapper[4771]: E0319 15:42:00.159776 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fd828f-9d29-435f-8950-3bc663193824" containerName="registry-server" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.159785 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fd828f-9d29-435f-8950-3bc663193824" containerName="registry-server" Mar 19 15:42:00 crc kubenswrapper[4771]: E0319 15:42:00.159803 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fd828f-9d29-435f-8950-3bc663193824" containerName="extract-utilities" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.159814 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fd828f-9d29-435f-8950-3bc663193824" 
containerName="extract-utilities" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.160147 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4fd828f-9d29-435f-8950-3bc663193824" containerName="registry-server" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.160207 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc170ab-e6ec-4cbc-a1b9-d11ccb72a976" containerName="registry-server" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.161064 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565582-x76kx" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.163360 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.163686 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.164971 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.167769 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565582-x76kx"] Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.551631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p54h4\" (UniqueName: \"kubernetes.io/projected/7f5e980b-fada-4009-926b-30e270e6b7bb-kube-api-access-p54h4\") pod \"auto-csr-approver-29565582-x76kx\" (UID: \"7f5e980b-fada-4009-926b-30e270e6b7bb\") " pod="openshift-infra/auto-csr-approver-29565582-x76kx" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.653834 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p54h4\" (UniqueName: 
\"kubernetes.io/projected/7f5e980b-fada-4009-926b-30e270e6b7bb-kube-api-access-p54h4\") pod \"auto-csr-approver-29565582-x76kx\" (UID: \"7f5e980b-fada-4009-926b-30e270e6b7bb\") " pod="openshift-infra/auto-csr-approver-29565582-x76kx" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.674939 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p54h4\" (UniqueName: \"kubernetes.io/projected/7f5e980b-fada-4009-926b-30e270e6b7bb-kube-api-access-p54h4\") pod \"auto-csr-approver-29565582-x76kx\" (UID: \"7f5e980b-fada-4009-926b-30e270e6b7bb\") " pod="openshift-infra/auto-csr-approver-29565582-x76kx" Mar 19 15:42:00 crc kubenswrapper[4771]: I0319 15:42:00.859100 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565582-x76kx" Mar 19 15:42:01 crc kubenswrapper[4771]: I0319 15:42:01.290572 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565582-x76kx"] Mar 19 15:42:01 crc kubenswrapper[4771]: W0319 15:42:01.297279 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f5e980b_fada_4009_926b_30e270e6b7bb.slice/crio-6f26f5caa217ad3d8df8fb245c9533983e94910467905791f6c055710410311f WatchSource:0}: Error finding container 6f26f5caa217ad3d8df8fb245c9533983e94910467905791f6c055710410311f: Status 404 returned error can't find the container with id 6f26f5caa217ad3d8df8fb245c9533983e94910467905791f6c055710410311f Mar 19 15:42:01 crc kubenswrapper[4771]: I0319 15:42:01.557697 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565582-x76kx" event={"ID":"7f5e980b-fada-4009-926b-30e270e6b7bb","Type":"ContainerStarted","Data":"6f26f5caa217ad3d8df8fb245c9533983e94910467905791f6c055710410311f"} Mar 19 15:42:03 crc kubenswrapper[4771]: I0319 15:42:03.584662 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="7f5e980b-fada-4009-926b-30e270e6b7bb" containerID="1246bfeafc5a6ea6bed8fa76649b760f812907cb48d823008a79b7b4269e59fa" exitCode=0 Mar 19 15:42:03 crc kubenswrapper[4771]: I0319 15:42:03.584746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565582-x76kx" event={"ID":"7f5e980b-fada-4009-926b-30e270e6b7bb","Type":"ContainerDied","Data":"1246bfeafc5a6ea6bed8fa76649b760f812907cb48d823008a79b7b4269e59fa"} Mar 19 15:42:04 crc kubenswrapper[4771]: I0319 15:42:04.935157 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565582-x76kx" Mar 19 15:42:05 crc kubenswrapper[4771]: I0319 15:42:05.125494 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p54h4\" (UniqueName: \"kubernetes.io/projected/7f5e980b-fada-4009-926b-30e270e6b7bb-kube-api-access-p54h4\") pod \"7f5e980b-fada-4009-926b-30e270e6b7bb\" (UID: \"7f5e980b-fada-4009-926b-30e270e6b7bb\") " Mar 19 15:42:05 crc kubenswrapper[4771]: I0319 15:42:05.135398 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5e980b-fada-4009-926b-30e270e6b7bb-kube-api-access-p54h4" (OuterVolumeSpecName: "kube-api-access-p54h4") pod "7f5e980b-fada-4009-926b-30e270e6b7bb" (UID: "7f5e980b-fada-4009-926b-30e270e6b7bb"). InnerVolumeSpecName "kube-api-access-p54h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:42:05 crc kubenswrapper[4771]: I0319 15:42:05.227869 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p54h4\" (UniqueName: \"kubernetes.io/projected/7f5e980b-fada-4009-926b-30e270e6b7bb-kube-api-access-p54h4\") on node \"crc\" DevicePath \"\"" Mar 19 15:42:05 crc kubenswrapper[4771]: I0319 15:42:05.609626 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565582-x76kx" event={"ID":"7f5e980b-fada-4009-926b-30e270e6b7bb","Type":"ContainerDied","Data":"6f26f5caa217ad3d8df8fb245c9533983e94910467905791f6c055710410311f"} Mar 19 15:42:05 crc kubenswrapper[4771]: I0319 15:42:05.609909 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f26f5caa217ad3d8df8fb245c9533983e94910467905791f6c055710410311f" Mar 19 15:42:05 crc kubenswrapper[4771]: I0319 15:42:05.609708 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565582-x76kx" Mar 19 15:42:06 crc kubenswrapper[4771]: I0319 15:42:06.040141 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565576-4rhn7"] Mar 19 15:42:06 crc kubenswrapper[4771]: I0319 15:42:06.041382 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565576-4rhn7"] Mar 19 15:42:07 crc kubenswrapper[4771]: I0319 15:42:07.527835 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="523fdd36-adcb-4bde-9ba4-d0f5a9560caa" path="/var/lib/kubelet/pods/523fdd36-adcb-4bde-9ba4-d0f5a9560caa/volumes" Mar 19 15:42:09 crc kubenswrapper[4771]: I0319 15:42:09.509177 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:42:09 crc kubenswrapper[4771]: E0319 15:42:09.511263 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:42:10 crc kubenswrapper[4771]: I0319 15:42:10.509336 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:42:10 crc kubenswrapper[4771]: E0319 15:42:10.509902 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:42:20 crc kubenswrapper[4771]: I0319 15:42:20.509053 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:42:20 crc kubenswrapper[4771]: E0319 15:42:20.509923 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.027204 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.027601 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" 
podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.027654 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.028398 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.028524 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" gracePeriod=600 Mar 19 15:42:23 crc kubenswrapper[4771]: E0319 15:42:23.160876 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.508920 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:42:23 crc kubenswrapper[4771]: E0319 15:42:23.509215 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.787476 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" exitCode=0 Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.787550 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"} Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.787875 4771 scope.go:117] "RemoveContainer" containerID="9a7f9aaa7059458b50170d0fb5711aac81f8c848fea74597bd5d13bc3f12edd3" Mar 19 15:42:23 crc kubenswrapper[4771]: I0319 15:42:23.788479 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:42:23 crc kubenswrapper[4771]: E0319 15:42:23.788770 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:42:34 crc kubenswrapper[4771]: I0319 15:42:34.508962 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:42:34 crc 
kubenswrapper[4771]: I0319 15:42:34.509549 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:42:34 crc kubenswrapper[4771]: E0319 15:42:34.509786 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:42:34 crc kubenswrapper[4771]: E0319 15:42:34.509786 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:42:39 crc kubenswrapper[4771]: I0319 15:42:39.508596 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:42:39 crc kubenswrapper[4771]: E0319 15:42:39.509508 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:42:45 crc kubenswrapper[4771]: I0319 15:42:45.510553 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:42:45 crc kubenswrapper[4771]: E0319 15:42:45.511686 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:42:48 crc kubenswrapper[4771]: I0319 15:42:48.508966 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:42:48 crc kubenswrapper[4771]: E0319 15:42:48.509632 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:42:50 crc kubenswrapper[4771]: I0319 15:42:50.510319 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:42:50 crc kubenswrapper[4771]: E0319 15:42:50.511876 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:42:56 crc kubenswrapper[4771]: I0319 15:42:56.508466 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:42:56 crc kubenswrapper[4771]: E0319 15:42:56.509299 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" 
pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:42:57 crc kubenswrapper[4771]: I0319 15:42:57.667440 4771 scope.go:117] "RemoveContainer" containerID="918da1980407aa4504720ade6b61ebfc02466296fe98aed6dfcced04a6dbca69" Mar 19 15:43:00 crc kubenswrapper[4771]: I0319 15:43:00.509517 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:43:00 crc kubenswrapper[4771]: E0319 15:43:00.510409 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:43:02 crc kubenswrapper[4771]: I0319 15:43:02.509493 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:43:02 crc kubenswrapper[4771]: E0319 15:43:02.511056 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:43:08 crc kubenswrapper[4771]: I0319 15:43:08.508596 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:43:08 crc kubenswrapper[4771]: E0319 15:43:08.509682 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:43:12 crc kubenswrapper[4771]: I0319 15:43:12.509148 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:43:12 crc kubenswrapper[4771]: E0319 15:43:12.509406 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:43:15 crc kubenswrapper[4771]: I0319 15:43:15.509256 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:43:15 crc kubenswrapper[4771]: E0319 15:43:15.509772 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:43:22 crc kubenswrapper[4771]: I0319 15:43:22.508951 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:43:22 crc kubenswrapper[4771]: E0319 15:43:22.510123 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" 
podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:43:27 crc kubenswrapper[4771]: I0319 15:43:27.509073 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:43:27 crc kubenswrapper[4771]: E0319 15:43:27.509962 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:43:30 crc kubenswrapper[4771]: I0319 15:43:30.509535 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:43:30 crc kubenswrapper[4771]: E0319 15:43:30.510806 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:43:36 crc kubenswrapper[4771]: I0319 15:43:36.508906 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:43:36 crc kubenswrapper[4771]: E0319 15:43:36.509736 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:43:42 crc kubenswrapper[4771]: I0319 15:43:42.509389 4771 scope.go:117] "RemoveContainer" 
containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:43:42 crc kubenswrapper[4771]: E0319 15:43:42.510152 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:43:44 crc kubenswrapper[4771]: I0319 15:43:44.508512 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:43:44 crc kubenswrapper[4771]: E0319 15:43:44.509091 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:43:49 crc kubenswrapper[4771]: I0319 15:43:49.508839 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:43:49 crc kubenswrapper[4771]: E0319 15:43:49.509952 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:43:56 crc kubenswrapper[4771]: I0319 15:43:56.509910 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:43:56 crc kubenswrapper[4771]: E0319 15:43:56.511194 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:43:59 crc kubenswrapper[4771]: I0319 15:43:59.512867 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:43:59 crc kubenswrapper[4771]: E0319 15:43:59.513696 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.166639 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565584-rnpv2"] Mar 19 15:44:00 crc kubenswrapper[4771]: E0319 15:44:00.167289 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5e980b-fada-4009-926b-30e270e6b7bb" containerName="oc" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.167317 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5e980b-fada-4009-926b-30e270e6b7bb" containerName="oc" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.167771 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5e980b-fada-4009-926b-30e270e6b7bb" containerName="oc" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.168849 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565584-rnpv2" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.173090 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.173222 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.173389 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.177352 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565584-rnpv2"] Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.290321 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f7vb\" (UniqueName: \"kubernetes.io/projected/54c3c4d7-90d4-4e95-8133-90a9e91a975f-kube-api-access-9f7vb\") pod \"auto-csr-approver-29565584-rnpv2\" (UID: \"54c3c4d7-90d4-4e95-8133-90a9e91a975f\") " pod="openshift-infra/auto-csr-approver-29565584-rnpv2" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.391638 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7vb\" (UniqueName: \"kubernetes.io/projected/54c3c4d7-90d4-4e95-8133-90a9e91a975f-kube-api-access-9f7vb\") pod \"auto-csr-approver-29565584-rnpv2\" (UID: \"54c3c4d7-90d4-4e95-8133-90a9e91a975f\") " pod="openshift-infra/auto-csr-approver-29565584-rnpv2" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.413268 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7vb\" (UniqueName: \"kubernetes.io/projected/54c3c4d7-90d4-4e95-8133-90a9e91a975f-kube-api-access-9f7vb\") pod \"auto-csr-approver-29565584-rnpv2\" (UID: \"54c3c4d7-90d4-4e95-8133-90a9e91a975f\") " 
pod="openshift-infra/auto-csr-approver-29565584-rnpv2" Mar 19 15:44:00 crc kubenswrapper[4771]: I0319 15:44:00.495634 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565584-rnpv2" Mar 19 15:44:01 crc kubenswrapper[4771]: I0319 15:44:01.001224 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565584-rnpv2"] Mar 19 15:44:01 crc kubenswrapper[4771]: W0319 15:44:01.001863 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54c3c4d7_90d4_4e95_8133_90a9e91a975f.slice/crio-c99622e5f2639352923fcc9e90bf07e914ce298046d51ab20c9dfd5f76970397 WatchSource:0}: Error finding container c99622e5f2639352923fcc9e90bf07e914ce298046d51ab20c9dfd5f76970397: Status 404 returned error can't find the container with id c99622e5f2639352923fcc9e90bf07e914ce298046d51ab20c9dfd5f76970397 Mar 19 15:44:01 crc kubenswrapper[4771]: I0319 15:44:01.006513 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 15:44:01 crc kubenswrapper[4771]: I0319 15:44:01.622193 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565584-rnpv2" event={"ID":"54c3c4d7-90d4-4e95-8133-90a9e91a975f","Type":"ContainerStarted","Data":"c99622e5f2639352923fcc9e90bf07e914ce298046d51ab20c9dfd5f76970397"} Mar 19 15:44:02 crc kubenswrapper[4771]: I0319 15:44:02.508708 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:44:02 crc kubenswrapper[4771]: E0319 15:44:02.510341 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" 
podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:44:02 crc kubenswrapper[4771]: I0319 15:44:02.630049 4771 generic.go:334] "Generic (PLEG): container finished" podID="54c3c4d7-90d4-4e95-8133-90a9e91a975f" containerID="eb9a9d31b336a99e462320135f148cfd4d99361607881e25165c6509dfec7626" exitCode=0 Mar 19 15:44:02 crc kubenswrapper[4771]: I0319 15:44:02.630106 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565584-rnpv2" event={"ID":"54c3c4d7-90d4-4e95-8133-90a9e91a975f","Type":"ContainerDied","Data":"eb9a9d31b336a99e462320135f148cfd4d99361607881e25165c6509dfec7626"} Mar 19 15:44:03 crc kubenswrapper[4771]: I0319 15:44:03.940086 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565584-rnpv2" Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.030944 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tr7n"] Mar 19 15:44:04 crc kubenswrapper[4771]: E0319 15:44:04.031382 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c3c4d7-90d4-4e95-8133-90a9e91a975f" containerName="oc" Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.031406 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c3c4d7-90d4-4e95-8133-90a9e91a975f" containerName="oc" Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.031610 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c3c4d7-90d4-4e95-8133-90a9e91a975f" containerName="oc" Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.033039 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6tr7n" Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.047733 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tr7n"] Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.061796 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f7vb\" (UniqueName: \"kubernetes.io/projected/54c3c4d7-90d4-4e95-8133-90a9e91a975f-kube-api-access-9f7vb\") pod \"54c3c4d7-90d4-4e95-8133-90a9e91a975f\" (UID: \"54c3c4d7-90d4-4e95-8133-90a9e91a975f\") " Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.067412 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c3c4d7-90d4-4e95-8133-90a9e91a975f-kube-api-access-9f7vb" (OuterVolumeSpecName: "kube-api-access-9f7vb") pod "54c3c4d7-90d4-4e95-8133-90a9e91a975f" (UID: "54c3c4d7-90d4-4e95-8133-90a9e91a975f"). InnerVolumeSpecName "kube-api-access-9f7vb". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.163453 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lvqp\" (UniqueName: \"kubernetes.io/projected/8ee164a3-fd93-4e47-91b5-d28ead28101b-kube-api-access-6lvqp\") pod \"certified-operators-6tr7n\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") " pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.163501 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-utilities\") pod \"certified-operators-6tr7n\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") " pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.163523 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-catalog-content\") pod \"certified-operators-6tr7n\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") " pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.163682 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f7vb\" (UniqueName: \"kubernetes.io/projected/54c3c4d7-90d4-4e95-8133-90a9e91a975f-kube-api-access-9f7vb\") on node \"crc\" DevicePath \"\""
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.264752 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lvqp\" (UniqueName: \"kubernetes.io/projected/8ee164a3-fd93-4e47-91b5-d28ead28101b-kube-api-access-6lvqp\") pod \"certified-operators-6tr7n\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") " pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.264808 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-utilities\") pod \"certified-operators-6tr7n\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") " pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.264831 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-catalog-content\") pod \"certified-operators-6tr7n\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") " pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.265479 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-catalog-content\") pod \"certified-operators-6tr7n\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") " pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.265508 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-utilities\") pod \"certified-operators-6tr7n\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") " pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.288095 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lvqp\" (UniqueName: \"kubernetes.io/projected/8ee164a3-fd93-4e47-91b5-d28ead28101b-kube-api-access-6lvqp\") pod \"certified-operators-6tr7n\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") " pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.353290 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.654495 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565584-rnpv2" event={"ID":"54c3c4d7-90d4-4e95-8133-90a9e91a975f","Type":"ContainerDied","Data":"c99622e5f2639352923fcc9e90bf07e914ce298046d51ab20c9dfd5f76970397"}
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.654537 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c99622e5f2639352923fcc9e90bf07e914ce298046d51ab20c9dfd5f76970397"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.654587 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565584-rnpv2"
Mar 19 15:44:04 crc kubenswrapper[4771]: I0319 15:44:04.837481 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tr7n"]
Mar 19 15:44:04 crc kubenswrapper[4771]: W0319 15:44:04.839719 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ee164a3_fd93_4e47_91b5_d28ead28101b.slice/crio-c9e7c6978ff2bf765bff2caf7dad276d9ac854458a508c404ba7c0c486be9ca8 WatchSource:0}: Error finding container c9e7c6978ff2bf765bff2caf7dad276d9ac854458a508c404ba7c0c486be9ca8: Status 404 returned error can't find the container with id c9e7c6978ff2bf765bff2caf7dad276d9ac854458a508c404ba7c0c486be9ca8
Mar 19 15:44:05 crc kubenswrapper[4771]: I0319 15:44:05.005021 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565578-bmw69"]
Mar 19 15:44:05 crc kubenswrapper[4771]: I0319 15:44:05.012525 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565578-bmw69"]
Mar 19 15:44:05 crc kubenswrapper[4771]: I0319 15:44:05.521338 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae9e370e-3985-4fb9-83fd-b080b1c22ca0" path="/var/lib/kubelet/pods/ae9e370e-3985-4fb9-83fd-b080b1c22ca0/volumes"
Mar 19 15:44:05 crc kubenswrapper[4771]: I0319 15:44:05.662722 4771 generic.go:334] "Generic (PLEG): container finished" podID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerID="ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d" exitCode=0
Mar 19 15:44:05 crc kubenswrapper[4771]: I0319 15:44:05.662794 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tr7n" event={"ID":"8ee164a3-fd93-4e47-91b5-d28ead28101b","Type":"ContainerDied","Data":"ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d"}
Mar 19 15:44:05 crc kubenswrapper[4771]: I0319 15:44:05.662870 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tr7n" event={"ID":"8ee164a3-fd93-4e47-91b5-d28ead28101b","Type":"ContainerStarted","Data":"c9e7c6978ff2bf765bff2caf7dad276d9ac854458a508c404ba7c0c486be9ca8"}
Mar 19 15:44:07 crc kubenswrapper[4771]: I0319 15:44:07.681608 4771 generic.go:334] "Generic (PLEG): container finished" podID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerID="119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8" exitCode=0
Mar 19 15:44:07 crc kubenswrapper[4771]: I0319 15:44:07.681695 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tr7n" event={"ID":"8ee164a3-fd93-4e47-91b5-d28ead28101b","Type":"ContainerDied","Data":"119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8"}
Mar 19 15:44:08 crc kubenswrapper[4771]: I0319 15:44:08.695345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tr7n" event={"ID":"8ee164a3-fd93-4e47-91b5-d28ead28101b","Type":"ContainerStarted","Data":"b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12"}
Mar 19 15:44:08 crc kubenswrapper[4771]: I0319 15:44:08.725202 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6tr7n" podStartSLOduration=2.251563191 podStartE2EDuration="4.725175036s" podCreationTimestamp="2026-03-19 15:44:04 +0000 UTC" firstStartedPulling="2026-03-19 15:44:05.664146487 +0000 UTC m=+1704.892767699" lastFinishedPulling="2026-03-19 15:44:08.137758342 +0000 UTC m=+1707.366379544" observedRunningTime="2026-03-19 15:44:08.722934951 +0000 UTC m=+1707.951556253" watchObservedRunningTime="2026-03-19 15:44:08.725175036 +0000 UTC m=+1707.953796278"
Mar 19 15:44:10 crc kubenswrapper[4771]: I0319 15:44:10.509415 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"
Mar 19 15:44:10 crc kubenswrapper[4771]: E0319 15:44:10.509755 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:44:14 crc kubenswrapper[4771]: I0319 15:44:14.353463 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:14 crc kubenswrapper[4771]: I0319 15:44:14.353913 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:14 crc kubenswrapper[4771]: I0319 15:44:14.397643 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:14 crc kubenswrapper[4771]: I0319 15:44:14.508953 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"
Mar 19 15:44:14 crc kubenswrapper[4771]: E0319 15:44:14.509513 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:44:14 crc kubenswrapper[4771]: I0319 15:44:14.810032 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:14 crc kubenswrapper[4771]: I0319 15:44:14.871360 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tr7n"]
Mar 19 15:44:15 crc kubenswrapper[4771]: I0319 15:44:15.509321 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"
Mar 19 15:44:15 crc kubenswrapper[4771]: E0319 15:44:15.509587 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:44:16 crc kubenswrapper[4771]: I0319 15:44:16.782427 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6tr7n" podUID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerName="registry-server" containerID="cri-o://b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12" gracePeriod=2
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.215713 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.326087 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lvqp\" (UniqueName: \"kubernetes.io/projected/8ee164a3-fd93-4e47-91b5-d28ead28101b-kube-api-access-6lvqp\") pod \"8ee164a3-fd93-4e47-91b5-d28ead28101b\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") "
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.326147 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-catalog-content\") pod \"8ee164a3-fd93-4e47-91b5-d28ead28101b\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") "
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.326213 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-utilities\") pod \"8ee164a3-fd93-4e47-91b5-d28ead28101b\" (UID: \"8ee164a3-fd93-4e47-91b5-d28ead28101b\") "
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.327614 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-utilities" (OuterVolumeSpecName: "utilities") pod "8ee164a3-fd93-4e47-91b5-d28ead28101b" (UID: "8ee164a3-fd93-4e47-91b5-d28ead28101b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.331707 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee164a3-fd93-4e47-91b5-d28ead28101b-kube-api-access-6lvqp" (OuterVolumeSpecName: "kube-api-access-6lvqp") pod "8ee164a3-fd93-4e47-91b5-d28ead28101b" (UID: "8ee164a3-fd93-4e47-91b5-d28ead28101b"). InnerVolumeSpecName "kube-api-access-6lvqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.395376 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ee164a3-fd93-4e47-91b5-d28ead28101b" (UID: "8ee164a3-fd93-4e47-91b5-d28ead28101b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.428033 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lvqp\" (UniqueName: \"kubernetes.io/projected/8ee164a3-fd93-4e47-91b5-d28ead28101b-kube-api-access-6lvqp\") on node \"crc\" DevicePath \"\""
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.428059 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.428069 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ee164a3-fd93-4e47-91b5-d28ead28101b-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.793809 4771 generic.go:334] "Generic (PLEG): container finished" podID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerID="b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12" exitCode=0
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.793859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tr7n" event={"ID":"8ee164a3-fd93-4e47-91b5-d28ead28101b","Type":"ContainerDied","Data":"b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12"}
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.793898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tr7n" event={"ID":"8ee164a3-fd93-4e47-91b5-d28ead28101b","Type":"ContainerDied","Data":"c9e7c6978ff2bf765bff2caf7dad276d9ac854458a508c404ba7c0c486be9ca8"}
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.793897 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tr7n"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.793918 4771 scope.go:117] "RemoveContainer" containerID="b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.833405 4771 scope.go:117] "RemoveContainer" containerID="119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.833665 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tr7n"]
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.845626 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tr7n"]
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.853576 4771 scope.go:117] "RemoveContainer" containerID="ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.909829 4771 scope.go:117] "RemoveContainer" containerID="b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12"
Mar 19 15:44:17 crc kubenswrapper[4771]: E0319 15:44:17.910409 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12\": container with ID starting with b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12 not found: ID does not exist" containerID="b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.910459 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12"} err="failed to get container status \"b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12\": rpc error: code = NotFound desc = could not find container \"b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12\": container with ID starting with b0c8d5648ffddb33a831eac687a4b88b31501aa7dc5e23b410374970cbb8fd12 not found: ID does not exist"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.910493 4771 scope.go:117] "RemoveContainer" containerID="119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8"
Mar 19 15:44:17 crc kubenswrapper[4771]: E0319 15:44:17.910843 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8\": container with ID starting with 119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8 not found: ID does not exist" containerID="119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.910885 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8"} err="failed to get container status \"119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8\": rpc error: code = NotFound desc = could not find container \"119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8\": container with ID starting with 119d8ea1d08fcfb3a507451e6d6d0e55ec08dec0d6bda6f6ed507966f3ba9dc8 not found: ID does not exist"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.910911 4771 scope.go:117] "RemoveContainer" containerID="ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d"
Mar 19 15:44:17 crc kubenswrapper[4771]: E0319 15:44:17.911312 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d\": container with ID starting with ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d not found: ID does not exist" containerID="ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d"
Mar 19 15:44:17 crc kubenswrapper[4771]: I0319 15:44:17.911339 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d"} err="failed to get container status \"ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d\": rpc error: code = NotFound desc = could not find container \"ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d\": container with ID starting with ae0a58fffc2dcc3a5c28f38d90e1ce7b204dd913fe689fd6816493541b10d36d not found: ID does not exist"
Mar 19 15:44:19 crc kubenswrapper[4771]: I0319 15:44:19.524053 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee164a3-fd93-4e47-91b5-d28ead28101b" path="/var/lib/kubelet/pods/8ee164a3-fd93-4e47-91b5-d28ead28101b/volumes"
Mar 19 15:44:25 crc kubenswrapper[4771]: I0319 15:44:25.509458 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"
Mar 19 15:44:25 crc kubenswrapper[4771]: E0319 15:44:25.510302 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:44:26 crc kubenswrapper[4771]: I0319 15:44:26.509374 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"
Mar 19 15:44:26 crc kubenswrapper[4771]: E0319 15:44:26.509956 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:44:29 crc kubenswrapper[4771]: I0319 15:44:29.509200 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"
Mar 19 15:44:29 crc kubenswrapper[4771]: E0319 15:44:29.509743 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:44:37 crc kubenswrapper[4771]: I0319 15:44:37.509724 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"
Mar 19 15:44:37 crc kubenswrapper[4771]: E0319 15:44:37.510770 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:44:41 crc kubenswrapper[4771]: I0319 15:44:41.514917 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"
Mar 19 15:44:41 crc kubenswrapper[4771]: I0319 15:44:41.515667 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"
Mar 19 15:44:41 crc kubenswrapper[4771]: E0319 15:44:41.515796 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:44:41 crc kubenswrapper[4771]: E0319 15:44:41.515909 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:44:51 crc kubenswrapper[4771]: I0319 15:44:51.517571 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"
Mar 19 15:44:51 crc kubenswrapper[4771]: E0319 15:44:51.518766 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:44:54 crc kubenswrapper[4771]: I0319 15:44:54.509141 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"
Mar 19 15:44:54 crc kubenswrapper[4771]: E0319 15:44:54.509594 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:44:55 crc kubenswrapper[4771]: I0319 15:44:55.510198 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"
Mar 19 15:44:55 crc kubenswrapper[4771]: E0319 15:44:55.510511 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:44:57 crc kubenswrapper[4771]: I0319 15:44:57.774627 4771 scope.go:117] "RemoveContainer" containerID="81c6bd785bc62d983b852e6db2010b3f9af9371c68c00e9c7ac742ac6d30bf2e"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:00.157189 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"]
Mar 19 15:45:01 crc kubenswrapper[4771]: E0319 15:45:00.158152 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerName="extract-utilities"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:00.158173 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerName="extract-utilities"
Mar 19 15:45:01 crc kubenswrapper[4771]: E0319 15:45:00.158189 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerName="registry-server"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:00.158198 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerName="registry-server"
Mar 19 15:45:01 crc kubenswrapper[4771]: E0319 15:45:00.158231 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerName="extract-content"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:00.158239 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerName="extract-content"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:00.158465 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee164a3-fd93-4e47-91b5-d28ead28101b" containerName="registry-server"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:00.159137 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:00.165794 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:00.166215 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:00.180015 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"]
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.102328 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f140ed61-f23d-47e0-b6f5-4df106968b32-config-volume\") pod \"collect-profiles-29565585-cprmh\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.102441 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v524p\" (UniqueName: \"kubernetes.io/projected/f140ed61-f23d-47e0-b6f5-4df106968b32-kube-api-access-v524p\") pod \"collect-profiles-29565585-cprmh\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.102501 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f140ed61-f23d-47e0-b6f5-4df106968b32-secret-volume\") pod \"collect-profiles-29565585-cprmh\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.203521 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f140ed61-f23d-47e0-b6f5-4df106968b32-config-volume\") pod \"collect-profiles-29565585-cprmh\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.203969 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v524p\" (UniqueName: \"kubernetes.io/projected/f140ed61-f23d-47e0-b6f5-4df106968b32-kube-api-access-v524p\") pod \"collect-profiles-29565585-cprmh\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.204155 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f140ed61-f23d-47e0-b6f5-4df106968b32-secret-volume\") pod \"collect-profiles-29565585-cprmh\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.204434 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f140ed61-f23d-47e0-b6f5-4df106968b32-config-volume\") pod \"collect-profiles-29565585-cprmh\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.218480 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f140ed61-f23d-47e0-b6f5-4df106968b32-secret-volume\") pod \"collect-profiles-29565585-cprmh\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.223220 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v524p\" (UniqueName: \"kubernetes.io/projected/f140ed61-f23d-47e0-b6f5-4df106968b32-kube-api-access-v524p\") pod \"collect-profiles-29565585-cprmh\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.386672 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:01 crc kubenswrapper[4771]: I0319 15:45:01.651235 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"]
Mar 19 15:45:01 crc kubenswrapper[4771]: W0319 15:45:01.660031 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf140ed61_f23d_47e0_b6f5_4df106968b32.slice/crio-b9cd5171cb824ffe39646287a0a8eb6faad61ce242d6e59655cc7761c67f1377 WatchSource:0}: Error finding container b9cd5171cb824ffe39646287a0a8eb6faad61ce242d6e59655cc7761c67f1377: Status 404 returned error can't find the container with id b9cd5171cb824ffe39646287a0a8eb6faad61ce242d6e59655cc7761c67f1377
Mar 19 15:45:02 crc kubenswrapper[4771]: I0319 15:45:02.225167 4771 generic.go:334] "Generic (PLEG): container finished" podID="f140ed61-f23d-47e0-b6f5-4df106968b32" containerID="051b102cc58c876d025a4850d35b2a9585940a23bcdcd31ebd9acdc265154e0b" exitCode=0
Mar 19 15:45:02 crc kubenswrapper[4771]: I0319 15:45:02.225275 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh" event={"ID":"f140ed61-f23d-47e0-b6f5-4df106968b32","Type":"ContainerDied","Data":"051b102cc58c876d025a4850d35b2a9585940a23bcdcd31ebd9acdc265154e0b"}
Mar 19 15:45:02 crc kubenswrapper[4771]: I0319 15:45:02.225692 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh" event={"ID":"f140ed61-f23d-47e0-b6f5-4df106968b32","Type":"ContainerStarted","Data":"b9cd5171cb824ffe39646287a0a8eb6faad61ce242d6e59655cc7761c67f1377"}
Mar 19 15:45:02 crc kubenswrapper[4771]: I0319 15:45:02.509979 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"
Mar 19 15:45:02 crc kubenswrapper[4771]: E0319 15:45:02.510474 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.572369 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.653822 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f140ed61-f23d-47e0-b6f5-4df106968b32-config-volume\") pod \"f140ed61-f23d-47e0-b6f5-4df106968b32\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") "
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.653906 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f140ed61-f23d-47e0-b6f5-4df106968b32-secret-volume\") pod \"f140ed61-f23d-47e0-b6f5-4df106968b32\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") "
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.654028 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v524p\" (UniqueName: \"kubernetes.io/projected/f140ed61-f23d-47e0-b6f5-4df106968b32-kube-api-access-v524p\") pod \"f140ed61-f23d-47e0-b6f5-4df106968b32\" (UID: \"f140ed61-f23d-47e0-b6f5-4df106968b32\") "
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.654963 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f140ed61-f23d-47e0-b6f5-4df106968b32-config-volume" (OuterVolumeSpecName: "config-volume") pod "f140ed61-f23d-47e0-b6f5-4df106968b32" (UID: "f140ed61-f23d-47e0-b6f5-4df106968b32"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.683281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f140ed61-f23d-47e0-b6f5-4df106968b32-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f140ed61-f23d-47e0-b6f5-4df106968b32" (UID: "f140ed61-f23d-47e0-b6f5-4df106968b32"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.693704 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f140ed61-f23d-47e0-b6f5-4df106968b32-kube-api-access-v524p" (OuterVolumeSpecName: "kube-api-access-v524p") pod "f140ed61-f23d-47e0-b6f5-4df106968b32" (UID: "f140ed61-f23d-47e0-b6f5-4df106968b32"). InnerVolumeSpecName "kube-api-access-v524p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.755001 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f140ed61-f23d-47e0-b6f5-4df106968b32-config-volume\") on node \"crc\" DevicePath \"\""
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.755031 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f140ed61-f23d-47e0-b6f5-4df106968b32-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 19 15:45:03 crc kubenswrapper[4771]: I0319 15:45:03.755041 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v524p\" (UniqueName: \"kubernetes.io/projected/f140ed61-f23d-47e0-b6f5-4df106968b32-kube-api-access-v524p\") on node \"crc\" DevicePath \"\""
Mar 19 15:45:04 crc kubenswrapper[4771]: I0319 15:45:04.244922 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh"
event={"ID":"f140ed61-f23d-47e0-b6f5-4df106968b32","Type":"ContainerDied","Data":"b9cd5171cb824ffe39646287a0a8eb6faad61ce242d6e59655cc7761c67f1377"} Mar 19 15:45:04 crc kubenswrapper[4771]: I0319 15:45:04.244969 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9cd5171cb824ffe39646287a0a8eb6faad61ce242d6e59655cc7761c67f1377" Mar 19 15:45:04 crc kubenswrapper[4771]: I0319 15:45:04.245028 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565585-cprmh" Mar 19 15:45:09 crc kubenswrapper[4771]: I0319 15:45:09.508811 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:45:09 crc kubenswrapper[4771]: E0319 15:45:09.509629 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:45:10 crc kubenswrapper[4771]: I0319 15:45:10.508824 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:45:10 crc kubenswrapper[4771]: E0319 15:45:10.509134 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:45:15 crc kubenswrapper[4771]: I0319 15:45:15.511671 4771 scope.go:117] "RemoveContainer" 
containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:45:15 crc kubenswrapper[4771]: E0319 15:45:15.512513 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:45:24 crc kubenswrapper[4771]: I0319 15:45:24.509096 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:45:24 crc kubenswrapper[4771]: E0319 15:45:24.510310 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:45:25 crc kubenswrapper[4771]: I0319 15:45:25.525101 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:45:25 crc kubenswrapper[4771]: E0319 15:45:25.530801 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.088196 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-xtskz"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.102718 4771 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/keystone-7c74-account-create-update-zx8k5"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.113843 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-70d3-account-create-update-pqjbk"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.123151 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rqggq"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.131367 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b5fa-account-create-update-hwb48"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.139339 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-hvbsg"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.145290 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7c74-account-create-update-zx8k5"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.151073 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-70d3-account-create-update-pqjbk"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.156377 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-xtskz"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.162275 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b5fa-account-create-update-hwb48"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.167964 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rqggq"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.173622 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-hvbsg"] Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.535085 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0318a8-d871-42c6-aaa9-4c6f07bb90a8" 
path="/var/lib/kubelet/pods/1c0318a8-d871-42c6-aaa9-4c6f07bb90a8/volumes" Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.536152 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c93ca3-5940-420d-9ab0-e5a0d2a23964" path="/var/lib/kubelet/pods/67c93ca3-5940-420d-9ab0-e5a0d2a23964/volumes" Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.537238 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67ed00e9-ecdb-408d-8ad6-f4272af25922" path="/var/lib/kubelet/pods/67ed00e9-ecdb-408d-8ad6-f4272af25922/volumes" Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.539654 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="895ea0c2-2780-4e20-8d2d-ed5c378c6cfe" path="/var/lib/kubelet/pods/895ea0c2-2780-4e20-8d2d-ed5c378c6cfe/volumes" Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.541757 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="896559b5-ae6f-439f-be12-bcd65a28e5ec" path="/var/lib/kubelet/pods/896559b5-ae6f-439f-be12-bcd65a28e5ec/volumes" Mar 19 15:45:29 crc kubenswrapper[4771]: I0319 15:45:29.544749 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d50a3cd-a539-4f95-b8e1-e157be63cc4d" path="/var/lib/kubelet/pods/9d50a3cd-a539-4f95-b8e1-e157be63cc4d/volumes" Mar 19 15:45:30 crc kubenswrapper[4771]: I0319 15:45:30.509749 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:45:30 crc kubenswrapper[4771]: E0319 15:45:30.510542 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:45:37 crc kubenswrapper[4771]: I0319 15:45:37.508804 4771 scope.go:117] 
"RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:45:37 crc kubenswrapper[4771]: E0319 15:45:37.510717 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:45:39 crc kubenswrapper[4771]: I0319 15:45:39.508354 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:45:39 crc kubenswrapper[4771]: E0319 15:45:39.508958 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:45:42 crc kubenswrapper[4771]: I0319 15:45:42.508875 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:45:42 crc kubenswrapper[4771]: E0319 15:45:42.509565 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:45:50 crc kubenswrapper[4771]: I0319 15:45:50.509209 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:45:50 crc kubenswrapper[4771]: E0319 15:45:50.510102 
4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:45:54 crc kubenswrapper[4771]: I0319 15:45:54.509053 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:45:54 crc kubenswrapper[4771]: E0319 15:45:54.509827 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:45:57 crc kubenswrapper[4771]: I0319 15:45:57.508911 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:45:57 crc kubenswrapper[4771]: E0319 15:45:57.509164 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:45:57 crc kubenswrapper[4771]: I0319 15:45:57.867600 4771 scope.go:117] "RemoveContainer" containerID="7c2b2c5f26acff33e5f2bb0f327430855d0d7081af1251d352873b7e0e34f4e9" Mar 19 15:45:57 crc kubenswrapper[4771]: I0319 15:45:57.894052 4771 scope.go:117] "RemoveContainer" containerID="e1cf2fd7d2344c2ed37b1816e6988dbb86f5d037a89f5e51850bb0881f5a0925" Mar 19 15:45:57 crc kubenswrapper[4771]: 
I0319 15:45:57.936703 4771 scope.go:117] "RemoveContainer" containerID="39d69706c94058aae53342467ca2ae2a0f6db10d66bda8e28d0593f17c5ed487" Mar 19 15:45:57 crc kubenswrapper[4771]: I0319 15:45:57.976386 4771 scope.go:117] "RemoveContainer" containerID="5e67df495d952d972b57081b28e98d502571f5e99ab8ee5afb6fc36ae4b5177f" Mar 19 15:45:58 crc kubenswrapper[4771]: I0319 15:45:58.021345 4771 scope.go:117] "RemoveContainer" containerID="ba02326d3d8570c40017025d4d0aa13fbbc2b051c78a6e12a88e9d872e0dc123" Mar 19 15:45:58 crc kubenswrapper[4771]: I0319 15:45:58.053381 4771 scope.go:117] "RemoveContainer" containerID="a56201bae3bbf956178a4286e88c593e79f87fbd98eb487de2945489ff26e3f8" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.145733 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565586-7r4gc"] Mar 19 15:46:00 crc kubenswrapper[4771]: E0319 15:46:00.146585 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f140ed61-f23d-47e0-b6f5-4df106968b32" containerName="collect-profiles" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.146607 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f140ed61-f23d-47e0-b6f5-4df106968b32" containerName="collect-profiles" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.146883 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f140ed61-f23d-47e0-b6f5-4df106968b32" containerName="collect-profiles" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.147651 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565586-7r4gc" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.150869 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.151814 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.152348 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.163610 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565586-7r4gc"] Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.266879 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wnsb\" (UniqueName: \"kubernetes.io/projected/cc761847-2b88-49a3-ae95-0229cbb8bd98-kube-api-access-4wnsb\") pod \"auto-csr-approver-29565586-7r4gc\" (UID: \"cc761847-2b88-49a3-ae95-0229cbb8bd98\") " pod="openshift-infra/auto-csr-approver-29565586-7r4gc" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.368672 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wnsb\" (UniqueName: \"kubernetes.io/projected/cc761847-2b88-49a3-ae95-0229cbb8bd98-kube-api-access-4wnsb\") pod \"auto-csr-approver-29565586-7r4gc\" (UID: \"cc761847-2b88-49a3-ae95-0229cbb8bd98\") " pod="openshift-infra/auto-csr-approver-29565586-7r4gc" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.393363 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wnsb\" (UniqueName: \"kubernetes.io/projected/cc761847-2b88-49a3-ae95-0229cbb8bd98-kube-api-access-4wnsb\") pod \"auto-csr-approver-29565586-7r4gc\" (UID: \"cc761847-2b88-49a3-ae95-0229cbb8bd98\") " 
pod="openshift-infra/auto-csr-approver-29565586-7r4gc" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.466467 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565586-7r4gc" Mar 19 15:46:00 crc kubenswrapper[4771]: I0319 15:46:00.739925 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565586-7r4gc"] Mar 19 15:46:01 crc kubenswrapper[4771]: I0319 15:46:01.750715 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565586-7r4gc" event={"ID":"cc761847-2b88-49a3-ae95-0229cbb8bd98","Type":"ContainerStarted","Data":"7cbb70a90b8d2996df53606bb1fd0cf66816c7a955201ea357acccf0d5357f2a"} Mar 19 15:46:02 crc kubenswrapper[4771]: I0319 15:46:02.761697 4771 generic.go:334] "Generic (PLEG): container finished" podID="cc761847-2b88-49a3-ae95-0229cbb8bd98" containerID="bacf2a660e041a39d1d3ab52925d8cf8a5121c96f14de231eeaebb0311c7feac" exitCode=0 Mar 19 15:46:02 crc kubenswrapper[4771]: I0319 15:46:02.762171 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565586-7r4gc" event={"ID":"cc761847-2b88-49a3-ae95-0229cbb8bd98","Type":"ContainerDied","Data":"bacf2a660e041a39d1d3ab52925d8cf8a5121c96f14de231eeaebb0311c7feac"} Mar 19 15:46:04 crc kubenswrapper[4771]: I0319 15:46:04.042793 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565586-7r4gc" Mar 19 15:46:04 crc kubenswrapper[4771]: I0319 15:46:04.143297 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wnsb\" (UniqueName: \"kubernetes.io/projected/cc761847-2b88-49a3-ae95-0229cbb8bd98-kube-api-access-4wnsb\") pod \"cc761847-2b88-49a3-ae95-0229cbb8bd98\" (UID: \"cc761847-2b88-49a3-ae95-0229cbb8bd98\") " Mar 19 15:46:04 crc kubenswrapper[4771]: I0319 15:46:04.152243 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc761847-2b88-49a3-ae95-0229cbb8bd98-kube-api-access-4wnsb" (OuterVolumeSpecName: "kube-api-access-4wnsb") pod "cc761847-2b88-49a3-ae95-0229cbb8bd98" (UID: "cc761847-2b88-49a3-ae95-0229cbb8bd98"). InnerVolumeSpecName "kube-api-access-4wnsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:46:04 crc kubenswrapper[4771]: I0319 15:46:04.244827 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wnsb\" (UniqueName: \"kubernetes.io/projected/cc761847-2b88-49a3-ae95-0229cbb8bd98-kube-api-access-4wnsb\") on node \"crc\" DevicePath \"\"" Mar 19 15:46:04 crc kubenswrapper[4771]: I0319 15:46:04.786105 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565586-7r4gc" event={"ID":"cc761847-2b88-49a3-ae95-0229cbb8bd98","Type":"ContainerDied","Data":"7cbb70a90b8d2996df53606bb1fd0cf66816c7a955201ea357acccf0d5357f2a"} Mar 19 15:46:04 crc kubenswrapper[4771]: I0319 15:46:04.786157 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cbb70a90b8d2996df53606bb1fd0cf66816c7a955201ea357acccf0d5357f2a" Mar 19 15:46:04 crc kubenswrapper[4771]: I0319 15:46:04.786164 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565586-7r4gc" Mar 19 15:46:05 crc kubenswrapper[4771]: I0319 15:46:05.108131 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565580-9kqg9"] Mar 19 15:46:05 crc kubenswrapper[4771]: I0319 15:46:05.116415 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565580-9kqg9"] Mar 19 15:46:05 crc kubenswrapper[4771]: I0319 15:46:05.509114 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:46:05 crc kubenswrapper[4771]: E0319 15:46:05.509366 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:46:05 crc kubenswrapper[4771]: I0319 15:46:05.529899 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798cb9bf-2c53-499e-b648-9f9733743e23" path="/var/lib/kubelet/pods/798cb9bf-2c53-499e-b648-9f9733743e23/volumes" Mar 19 15:46:06 crc kubenswrapper[4771]: I0319 15:46:06.034507 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-275z8"] Mar 19 15:46:06 crc kubenswrapper[4771]: I0319 15:46:06.048144 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-275z8"] Mar 19 15:46:07 crc kubenswrapper[4771]: I0319 15:46:07.044407 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-mwcmg"] Mar 19 15:46:07 crc kubenswrapper[4771]: I0319 15:46:07.052490 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-mwcmg"] 
Mar 19 15:46:07 crc kubenswrapper[4771]: I0319 15:46:07.525214 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04404b99-3d2c-42a9-8137-f9f7991d2d06" path="/var/lib/kubelet/pods/04404b99-3d2c-42a9-8137-f9f7991d2d06/volumes" Mar 19 15:46:07 crc kubenswrapper[4771]: I0319 15:46:07.526257 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0387e9-2daa-4556-9745-450c0e3015dd" path="/var/lib/kubelet/pods/7a0387e9-2daa-4556-9745-450c0e3015dd/volumes" Mar 19 15:46:08 crc kubenswrapper[4771]: I0319 15:46:08.509977 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:46:08 crc kubenswrapper[4771]: E0319 15:46:08.510506 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:46:10 crc kubenswrapper[4771]: I0319 15:46:10.508742 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:46:10 crc kubenswrapper[4771]: E0319 15:46:10.509193 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:46:17 crc kubenswrapper[4771]: I0319 15:46:17.509057 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:46:17 crc kubenswrapper[4771]: E0319 15:46:17.509540 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:46:21 crc kubenswrapper[4771]: I0319 15:46:21.516503 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67" Mar 19 15:46:21 crc kubenswrapper[4771]: E0319 15:46:21.518180 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:46:25 crc kubenswrapper[4771]: I0319 15:46:25.509047 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01" Mar 19 15:46:25 crc kubenswrapper[4771]: E0319 15:46:25.509630 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:46:29 crc kubenswrapper[4771]: I0319 15:46:29.510657 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:46:29 crc kubenswrapper[4771]: E0319 15:46:29.511466 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:46:36 crc kubenswrapper[4771]: I0319 15:46:36.509117 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"
Mar 19 15:46:36 crc kubenswrapper[4771]: I0319 15:46:36.509762 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"
Mar 19 15:46:36 crc kubenswrapper[4771]: E0319 15:46:36.509959 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:46:36 crc kubenswrapper[4771]: E0319 15:46:36.510130 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:46:40 crc kubenswrapper[4771]: I0319 15:46:40.509398 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"
Mar 19 15:46:40 crc kubenswrapper[4771]: E0319 15:46:40.510306 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:46:47 crc kubenswrapper[4771]: I0319 15:46:47.509023 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"
Mar 19 15:46:47 crc kubenswrapper[4771]: I0319 15:46:47.509790 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"
Mar 19 15:46:47 crc kubenswrapper[4771]: E0319 15:46:47.509985 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:46:47 crc kubenswrapper[4771]: E0319 15:46:47.510204 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:46:53 crc kubenswrapper[4771]: I0319 15:46:53.508672 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"
Mar 19 15:46:53 crc kubenswrapper[4771]: E0319 15:46:53.509525 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:46:58 crc kubenswrapper[4771]: I0319 15:46:58.145414 4771 scope.go:117] "RemoveContainer" containerID="c98219a7ea9b7344e3da87cb392dfdcab8233d431a7d9c9e5dec2e007ecbebad"
Mar 19 15:46:58 crc kubenswrapper[4771]: I0319 15:46:58.196385 4771 scope.go:117] "RemoveContainer" containerID="134c2c9bc05f58f8fe7f26996e5f4fb6c536a57fbcb2e4ab153ee21931c8bdd3"
Mar 19 15:46:58 crc kubenswrapper[4771]: I0319 15:46:58.260333 4771 scope.go:117] "RemoveContainer" containerID="47c4d06dff1b7e394f1fd7c2fe6908fd24290a72bf595b2b00c23989d14771ee"
Mar 19 15:46:59 crc kubenswrapper[4771]: I0319 15:46:59.508379 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"
Mar 19 15:47:00 crc kubenswrapper[4771]: I0319 15:47:00.273006 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"}
Mar 19 15:47:00 crc kubenswrapper[4771]: I0319 15:47:00.273325 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 15:47:01 crc kubenswrapper[4771]: I0319 15:47:01.513696 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"
Mar 19 15:47:02 crc kubenswrapper[4771]: I0319 15:47:02.292514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"}
Mar 19 15:47:02 crc kubenswrapper[4771]: I0319 15:47:02.293409 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 19 15:47:04 crc kubenswrapper[4771]: I0319 15:47:04.316449 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" exitCode=0
Mar 19 15:47:04 crc kubenswrapper[4771]: I0319 15:47:04.316548 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"}
Mar 19 15:47:04 crc kubenswrapper[4771]: I0319 15:47:04.316915 4771 scope.go:117] "RemoveContainer" containerID="ae049e51f0eb6aef961237eaaf1901a05adc3ea2bc6e58c7885aff01dfec9f67"
Mar 19 15:47:04 crc kubenswrapper[4771]: I0319 15:47:04.317904 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:47:04 crc kubenswrapper[4771]: E0319 15:47:04.318435 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:47:05 crc kubenswrapper[4771]: I0319 15:47:05.508741 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"
Mar 19 15:47:05 crc kubenswrapper[4771]: E0319 15:47:05.509326 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:47:06 crc kubenswrapper[4771]: I0319 15:47:06.340788 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce" exitCode=0
Mar 19 15:47:06 crc kubenswrapper[4771]: I0319 15:47:06.340893 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"}
Mar 19 15:47:06 crc kubenswrapper[4771]: I0319 15:47:06.341201 4771 scope.go:117] "RemoveContainer" containerID="10cd27af851acc4fdf348971ca1d9cea1c90bae14f8703a63099e2307768de01"
Mar 19 15:47:06 crc kubenswrapper[4771]: I0319 15:47:06.342322 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:47:06 crc kubenswrapper[4771]: E0319 15:47:06.342859 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:47:18 crc kubenswrapper[4771]: I0319 15:47:18.509494 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"
Mar 19 15:47:18 crc kubenswrapper[4771]: E0319 15:47:18.510542 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:47:18 crc kubenswrapper[4771]: I0319 15:47:18.510797 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:47:18 crc kubenswrapper[4771]: E0319 15:47:18.511092 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:47:19 crc kubenswrapper[4771]: I0319 15:47:19.513382 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:47:19 crc kubenswrapper[4771]: E0319 15:47:19.513804 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:47:29 crc kubenswrapper[4771]: I0319 15:47:29.510808 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:47:29 crc kubenswrapper[4771]: E0319 15:47:29.511871 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:47:31 crc kubenswrapper[4771]: I0319 15:47:31.515767 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:47:31 crc kubenswrapper[4771]: E0319 15:47:31.516474 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:47:33 crc kubenswrapper[4771]: I0319 15:47:33.509317 4771 scope.go:117] "RemoveContainer" containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04"
Mar 19 15:47:34 crc kubenswrapper[4771]: I0319 15:47:34.597244 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"07a56d01ab75b71b5af456be4839284ab56bbaec8d73ef490a93e8d9a6ad4e01"}
Mar 19 15:47:43 crc kubenswrapper[4771]: I0319 15:47:43.508723 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:47:43 crc kubenswrapper[4771]: E0319 15:47:43.509499 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:47:44 crc kubenswrapper[4771]: I0319 15:47:44.509123 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:47:44 crc kubenswrapper[4771]: E0319 15:47:44.509497 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:47:55 crc kubenswrapper[4771]: I0319 15:47:55.508602 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:47:55 crc kubenswrapper[4771]: E0319 15:47:55.509185 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:47:58 crc kubenswrapper[4771]: I0319 15:47:58.508739 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:47:58 crc kubenswrapper[4771]: E0319 15:47:58.509364 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.166832 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565588-zswmt"]
Mar 19 15:48:00 crc kubenswrapper[4771]: E0319 15:48:00.167673 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc761847-2b88-49a3-ae95-0229cbb8bd98" containerName="oc"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.167695 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc761847-2b88-49a3-ae95-0229cbb8bd98" containerName="oc"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.168087 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc761847-2b88-49a3-ae95-0229cbb8bd98" containerName="oc"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.168903 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565588-zswmt"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.171756 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.172402 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.173955 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.180879 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565588-zswmt"]
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.259975 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvp5x\" (UniqueName: \"kubernetes.io/projected/f245468b-c53c-4eff-9a82-a8ed0153a10d-kube-api-access-cvp5x\") pod \"auto-csr-approver-29565588-zswmt\" (UID: \"f245468b-c53c-4eff-9a82-a8ed0153a10d\") " pod="openshift-infra/auto-csr-approver-29565588-zswmt"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.361806 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvp5x\" (UniqueName: \"kubernetes.io/projected/f245468b-c53c-4eff-9a82-a8ed0153a10d-kube-api-access-cvp5x\") pod \"auto-csr-approver-29565588-zswmt\" (UID: \"f245468b-c53c-4eff-9a82-a8ed0153a10d\") " pod="openshift-infra/auto-csr-approver-29565588-zswmt"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.383969 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvp5x\" (UniqueName: \"kubernetes.io/projected/f245468b-c53c-4eff-9a82-a8ed0153a10d-kube-api-access-cvp5x\") pod \"auto-csr-approver-29565588-zswmt\" (UID: \"f245468b-c53c-4eff-9a82-a8ed0153a10d\") " pod="openshift-infra/auto-csr-approver-29565588-zswmt"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.498325 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565588-zswmt"
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.724923 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565588-zswmt"]
Mar 19 15:48:00 crc kubenswrapper[4771]: I0319 15:48:00.850076 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565588-zswmt" event={"ID":"f245468b-c53c-4eff-9a82-a8ed0153a10d","Type":"ContainerStarted","Data":"08784bbf8be656a7dd964083553164e4bc6bee12df010d95fd59f1df7d2ad538"}
Mar 19 15:48:02 crc kubenswrapper[4771]: I0319 15:48:02.872608 4771 generic.go:334] "Generic (PLEG): container finished" podID="f245468b-c53c-4eff-9a82-a8ed0153a10d" containerID="928457a83c024a6414511c683217b3b727209f17c6a604b5196a63cb6a243886" exitCode=0
Mar 19 15:48:02 crc kubenswrapper[4771]: I0319 15:48:02.872685 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565588-zswmt" event={"ID":"f245468b-c53c-4eff-9a82-a8ed0153a10d","Type":"ContainerDied","Data":"928457a83c024a6414511c683217b3b727209f17c6a604b5196a63cb6a243886"}
Mar 19 15:48:04 crc kubenswrapper[4771]: I0319 15:48:04.234191 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565588-zswmt"
Mar 19 15:48:04 crc kubenswrapper[4771]: I0319 15:48:04.334076 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvp5x\" (UniqueName: \"kubernetes.io/projected/f245468b-c53c-4eff-9a82-a8ed0153a10d-kube-api-access-cvp5x\") pod \"f245468b-c53c-4eff-9a82-a8ed0153a10d\" (UID: \"f245468b-c53c-4eff-9a82-a8ed0153a10d\") "
Mar 19 15:48:04 crc kubenswrapper[4771]: I0319 15:48:04.342495 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f245468b-c53c-4eff-9a82-a8ed0153a10d-kube-api-access-cvp5x" (OuterVolumeSpecName: "kube-api-access-cvp5x") pod "f245468b-c53c-4eff-9a82-a8ed0153a10d" (UID: "f245468b-c53c-4eff-9a82-a8ed0153a10d"). InnerVolumeSpecName "kube-api-access-cvp5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:48:04 crc kubenswrapper[4771]: I0319 15:48:04.436525 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvp5x\" (UniqueName: \"kubernetes.io/projected/f245468b-c53c-4eff-9a82-a8ed0153a10d-kube-api-access-cvp5x\") on node \"crc\" DevicePath \"\""
Mar 19 15:48:04 crc kubenswrapper[4771]: I0319 15:48:04.895361 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565588-zswmt" event={"ID":"f245468b-c53c-4eff-9a82-a8ed0153a10d","Type":"ContainerDied","Data":"08784bbf8be656a7dd964083553164e4bc6bee12df010d95fd59f1df7d2ad538"}
Mar 19 15:48:04 crc kubenswrapper[4771]: I0319 15:48:04.895417 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08784bbf8be656a7dd964083553164e4bc6bee12df010d95fd59f1df7d2ad538"
Mar 19 15:48:04 crc kubenswrapper[4771]: I0319 15:48:04.895479 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565588-zswmt"
Mar 19 15:48:05 crc kubenswrapper[4771]: I0319 15:48:05.324487 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565582-x76kx"]
Mar 19 15:48:05 crc kubenswrapper[4771]: I0319 15:48:05.331285 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565582-x76kx"]
Mar 19 15:48:05 crc kubenswrapper[4771]: I0319 15:48:05.518697 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5e980b-fada-4009-926b-30e270e6b7bb" path="/var/lib/kubelet/pods/7f5e980b-fada-4009-926b-30e270e6b7bb/volumes"
Mar 19 15:48:09 crc kubenswrapper[4771]: I0319 15:48:09.509958 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:48:09 crc kubenswrapper[4771]: I0319 15:48:09.510706 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:48:09 crc kubenswrapper[4771]: E0319 15:48:09.510845 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:48:09 crc kubenswrapper[4771]: E0319 15:48:09.511301 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:48:20 crc kubenswrapper[4771]: I0319 15:48:20.509128 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:48:20 crc kubenswrapper[4771]: E0319 15:48:20.510552 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:48:23 crc kubenswrapper[4771]: I0319 15:48:23.508952 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:48:23 crc kubenswrapper[4771]: E0319 15:48:23.512619 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:48:34 crc kubenswrapper[4771]: I0319 15:48:34.509477 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:48:34 crc kubenswrapper[4771]: E0319 15:48:34.510400 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:48:36 crc kubenswrapper[4771]: I0319 15:48:36.509468 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:48:36 crc kubenswrapper[4771]: E0319 15:48:36.510275 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:48:48 crc kubenswrapper[4771]: I0319 15:48:48.508691 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:48:48 crc kubenswrapper[4771]: E0319 15:48:48.509812 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:48:51 crc kubenswrapper[4771]: I0319 15:48:51.514865 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:48:51 crc kubenswrapper[4771]: E0319 15:48:51.515744 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:48:58 crc kubenswrapper[4771]: I0319 15:48:58.355150 4771 scope.go:117] "RemoveContainer" containerID="1246bfeafc5a6ea6bed8fa76649b760f812907cb48d823008a79b7b4269e59fa"
Mar 19 15:48:59 crc kubenswrapper[4771]: I0319 15:48:59.509332 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:48:59 crc kubenswrapper[4771]: E0319 15:48:59.510281 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:49:02 crc kubenswrapper[4771]: I0319 15:49:02.508902 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:49:02 crc kubenswrapper[4771]: E0319 15:49:02.509401 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:49:14 crc kubenswrapper[4771]: I0319 15:49:14.508596 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:49:14 crc kubenswrapper[4771]: E0319 15:49:14.509344 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:49:14 crc kubenswrapper[4771]: I0319 15:49:14.509919 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:49:14 crc kubenswrapper[4771]: E0319 15:49:14.510163 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:49:27 crc kubenswrapper[4771]: I0319 15:49:27.509688 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:49:27 crc kubenswrapper[4771]: I0319 15:49:27.510385 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:49:27 crc kubenswrapper[4771]: E0319 15:49:27.513069 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:49:27 crc kubenswrapper[4771]: E0319 15:49:27.519875 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:49:40 crc kubenswrapper[4771]: I0319 15:49:40.510447 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:49:40 crc kubenswrapper[4771]: E0319 15:49:40.511335 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:49:41 crc kubenswrapper[4771]: I0319 15:49:41.515587 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:49:41 crc kubenswrapper[4771]: E0319 15:49:41.515824 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:49:52 crc kubenswrapper[4771]: I0319 15:49:52.508463 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:49:52 crc kubenswrapper[4771]: E0319 15:49:52.509602 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:49:53 crc kubenswrapper[4771]: I0319 15:49:53.027791 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:49:53 crc kubenswrapper[4771]: I0319 15:49:53.027856 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:49:55 crc kubenswrapper[4771]: I0319 15:49:55.508792 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:49:55 crc kubenswrapper[4771]: E0319 15:49:55.509498 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.150584 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565590-f82zb"]
Mar 19 15:50:00 crc kubenswrapper[4771]: E0319 15:50:00.151393 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f245468b-c53c-4eff-9a82-a8ed0153a10d" containerName="oc"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.151405 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f245468b-c53c-4eff-9a82-a8ed0153a10d" containerName="oc"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.151581 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f245468b-c53c-4eff-9a82-a8ed0153a10d" containerName="oc"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.152117 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565590-f82zb"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.158919 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.159088 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.159100 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.172393 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565590-f82zb"]
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.266935 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjhdb\" (UniqueName: \"kubernetes.io/projected/f19c29be-d0ee-4c67-a8d4-340e1d97108f-kube-api-access-pjhdb\") pod \"auto-csr-approver-29565590-f82zb\" (UID: \"f19c29be-d0ee-4c67-a8d4-340e1d97108f\") " pod="openshift-infra/auto-csr-approver-29565590-f82zb"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.368442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjhdb\" (UniqueName: \"kubernetes.io/projected/f19c29be-d0ee-4c67-a8d4-340e1d97108f-kube-api-access-pjhdb\") pod \"auto-csr-approver-29565590-f82zb\" (UID: \"f19c29be-d0ee-4c67-a8d4-340e1d97108f\") " pod="openshift-infra/auto-csr-approver-29565590-f82zb"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.393537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjhdb\" (UniqueName: \"kubernetes.io/projected/f19c29be-d0ee-4c67-a8d4-340e1d97108f-kube-api-access-pjhdb\") pod \"auto-csr-approver-29565590-f82zb\" (UID: \"f19c29be-d0ee-4c67-a8d4-340e1d97108f\") " pod="openshift-infra/auto-csr-approver-29565590-f82zb"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.471509 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565590-f82zb"
Mar 19 15:50:00 crc kubenswrapper[4771]: I0319 15:50:00.997592 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565590-f82zb"]
Mar 19 15:50:01 crc kubenswrapper[4771]: W0319 15:50:01.003200 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf19c29be_d0ee_4c67_a8d4_340e1d97108f.slice/crio-46004a7ce3e7b47426ef0341b813ad97603589e359bb2de81fb10f54821a7692 WatchSource:0}: Error finding container 46004a7ce3e7b47426ef0341b813ad97603589e359bb2de81fb10f54821a7692: Status 404 returned error can't find the container with id 46004a7ce3e7b47426ef0341b813ad97603589e359bb2de81fb10f54821a7692
Mar 19 15:50:01 crc kubenswrapper[4771]: I0319 15:50:01.004947 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 15:50:02 crc kubenswrapper[4771]: I0319 15:50:02.003872 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565590-f82zb" event={"ID":"f19c29be-d0ee-4c67-a8d4-340e1d97108f","Type":"ContainerStarted","Data":"46004a7ce3e7b47426ef0341b813ad97603589e359bb2de81fb10f54821a7692"}
Mar 19 15:50:03 crc kubenswrapper[4771]: I0319 15:50:03.015237 4771 generic.go:334] "Generic (PLEG): container finished" podID="f19c29be-d0ee-4c67-a8d4-340e1d97108f" containerID="ee0d28b104df4a1c35428059af049cdbc7b79c6630c3781b1d122d96edfaa250" exitCode=0
Mar 19 15:50:03 crc kubenswrapper[4771]: I0319 15:50:03.015311 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565590-f82zb" event={"ID":"f19c29be-d0ee-4c67-a8d4-340e1d97108f","Type":"ContainerDied","Data":"ee0d28b104df4a1c35428059af049cdbc7b79c6630c3781b1d122d96edfaa250"}
Mar 19 15:50:04 crc kubenswrapper[4771]: I0319 15:50:04.305043 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565590-f82zb"
Mar 19 15:50:04 crc kubenswrapper[4771]: I0319 15:50:04.331340 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjhdb\" (UniqueName: \"kubernetes.io/projected/f19c29be-d0ee-4c67-a8d4-340e1d97108f-kube-api-access-pjhdb\") pod \"f19c29be-d0ee-4c67-a8d4-340e1d97108f\" (UID: \"f19c29be-d0ee-4c67-a8d4-340e1d97108f\") "
Mar 19 15:50:04 crc kubenswrapper[4771]: I0319 15:50:04.338549 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f19c29be-d0ee-4c67-a8d4-340e1d97108f-kube-api-access-pjhdb" (OuterVolumeSpecName: "kube-api-access-pjhdb") pod "f19c29be-d0ee-4c67-a8d4-340e1d97108f" (UID: "f19c29be-d0ee-4c67-a8d4-340e1d97108f"). InnerVolumeSpecName "kube-api-access-pjhdb".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:50:04 crc kubenswrapper[4771]: I0319 15:50:04.434613 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjhdb\" (UniqueName: \"kubernetes.io/projected/f19c29be-d0ee-4c67-a8d4-340e1d97108f-kube-api-access-pjhdb\") on node \"crc\" DevicePath \"\"" Mar 19 15:50:05 crc kubenswrapper[4771]: I0319 15:50:05.037102 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565590-f82zb" event={"ID":"f19c29be-d0ee-4c67-a8d4-340e1d97108f","Type":"ContainerDied","Data":"46004a7ce3e7b47426ef0341b813ad97603589e359bb2de81fb10f54821a7692"} Mar 19 15:50:05 crc kubenswrapper[4771]: I0319 15:50:05.037187 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46004a7ce3e7b47426ef0341b813ad97603589e359bb2de81fb10f54821a7692" Mar 19 15:50:05 crc kubenswrapper[4771]: I0319 15:50:05.037260 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565590-f82zb" Mar 19 15:50:05 crc kubenswrapper[4771]: I0319 15:50:05.404565 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565584-rnpv2"] Mar 19 15:50:05 crc kubenswrapper[4771]: I0319 15:50:05.421444 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565584-rnpv2"] Mar 19 15:50:05 crc kubenswrapper[4771]: I0319 15:50:05.521710 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c3c4d7-90d4-4e95-8133-90a9e91a975f" path="/var/lib/kubelet/pods/54c3c4d7-90d4-4e95-8133-90a9e91a975f/volumes" Mar 19 15:50:06 crc kubenswrapper[4771]: I0319 15:50:06.510150 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" Mar 19 15:50:06 crc kubenswrapper[4771]: E0319 15:50:06.510852 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:50:07 crc kubenswrapper[4771]: I0319 15:50:07.509199 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce" Mar 19 15:50:07 crc kubenswrapper[4771]: E0319 15:50:07.509657 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:50:18 crc kubenswrapper[4771]: I0319 15:50:18.508705 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" Mar 19 15:50:18 crc kubenswrapper[4771]: E0319 15:50:18.509571 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:50:19 crc kubenswrapper[4771]: I0319 15:50:19.509082 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce" Mar 19 15:50:19 crc kubenswrapper[4771]: E0319 15:50:19.509538 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" 
podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:50:23 crc kubenswrapper[4771]: I0319 15:50:23.027148 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:50:23 crc kubenswrapper[4771]: I0319 15:50:23.027749 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:50:32 crc kubenswrapper[4771]: I0319 15:50:32.508703 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" Mar 19 15:50:32 crc kubenswrapper[4771]: E0319 15:50:32.509552 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:50:34 crc kubenswrapper[4771]: I0319 15:50:34.509546 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce" Mar 19 15:50:34 crc kubenswrapper[4771]: E0319 15:50:34.510322 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:50:44 crc 
kubenswrapper[4771]: I0319 15:50:44.509153 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" Mar 19 15:50:44 crc kubenswrapper[4771]: E0319 15:50:44.511156 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:50:47 crc kubenswrapper[4771]: I0319 15:50:47.509499 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce" Mar 19 15:50:47 crc kubenswrapper[4771]: E0319 15:50:47.510232 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:50:53 crc kubenswrapper[4771]: I0319 15:50:53.027156 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 15:50:53 crc kubenswrapper[4771]: I0319 15:50:53.028103 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 15:50:53 crc kubenswrapper[4771]: I0319 15:50:53.028165 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 15:50:53 crc kubenswrapper[4771]: I0319 15:50:53.029077 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07a56d01ab75b71b5af456be4839284ab56bbaec8d73ef490a93e8d9a6ad4e01"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 15:50:53 crc kubenswrapper[4771]: I0319 15:50:53.029137 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://07a56d01ab75b71b5af456be4839284ab56bbaec8d73ef490a93e8d9a6ad4e01" gracePeriod=600 Mar 19 15:50:53 crc kubenswrapper[4771]: I0319 15:50:53.482458 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="07a56d01ab75b71b5af456be4839284ab56bbaec8d73ef490a93e8d9a6ad4e01" exitCode=0 Mar 19 15:50:53 crc kubenswrapper[4771]: I0319 15:50:53.482540 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"07a56d01ab75b71b5af456be4839284ab56bbaec8d73ef490a93e8d9a6ad4e01"} Mar 19 15:50:53 crc kubenswrapper[4771]: I0319 15:50:53.482865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"} Mar 19 15:50:53 crc kubenswrapper[4771]: I0319 15:50:53.482885 4771 scope.go:117] "RemoveContainer" 
containerID="a19dffeaf2eb71fe674c0c3052ea794b85c45c174ea3cb8014ccbe0b1e7a3a04" Mar 19 15:50:55 crc kubenswrapper[4771]: I0319 15:50:55.509578 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" Mar 19 15:50:55 crc kubenswrapper[4771]: E0319 15:50:55.510472 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:50:58 crc kubenswrapper[4771]: I0319 15:50:58.474399 4771 scope.go:117] "RemoveContainer" containerID="eb9a9d31b336a99e462320135f148cfd4d99361607881e25165c6509dfec7626" Mar 19 15:51:00 crc kubenswrapper[4771]: I0319 15:51:00.509555 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce" Mar 19 15:51:00 crc kubenswrapper[4771]: E0319 15:51:00.510309 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:51:09 crc kubenswrapper[4771]: I0319 15:51:09.508507 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" Mar 19 15:51:09 crc kubenswrapper[4771]: E0319 15:51:09.509349 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" 
podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:51:14 crc kubenswrapper[4771]: I0319 15:51:14.508808 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce" Mar 19 15:51:14 crc kubenswrapper[4771]: E0319 15:51:14.510242 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.509560 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" Mar 19 15:51:24 crc kubenswrapper[4771]: E0319 15:51:24.512339 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.616022 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-scb2s"] Mar 19 15:51:24 crc kubenswrapper[4771]: E0319 15:51:24.619564 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f19c29be-d0ee-4c67-a8d4-340e1d97108f" containerName="oc" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.619609 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19c29be-d0ee-4c67-a8d4-340e1d97108f" containerName="oc" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.619926 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f19c29be-d0ee-4c67-a8d4-340e1d97108f" containerName="oc" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 
15:51:24.622276 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.644949 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-scb2s"] Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.775079 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-catalog-content\") pod \"redhat-marketplace-scb2s\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.775359 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdhs2\" (UniqueName: \"kubernetes.io/projected/b3d975a0-5943-4643-a74f-6a5b35efd538-kube-api-access-fdhs2\") pod \"redhat-marketplace-scb2s\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.775386 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-utilities\") pod \"redhat-marketplace-scb2s\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.877323 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-catalog-content\") pod \"redhat-marketplace-scb2s\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: 
I0319 15:51:24.877396 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdhs2\" (UniqueName: \"kubernetes.io/projected/b3d975a0-5943-4643-a74f-6a5b35efd538-kube-api-access-fdhs2\") pod \"redhat-marketplace-scb2s\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.877430 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-utilities\") pod \"redhat-marketplace-scb2s\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.877978 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-utilities\") pod \"redhat-marketplace-scb2s\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.878464 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-catalog-content\") pod \"redhat-marketplace-scb2s\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.917846 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdhs2\" (UniqueName: \"kubernetes.io/projected/b3d975a0-5943-4643-a74f-6a5b35efd538-kube-api-access-fdhs2\") pod \"redhat-marketplace-scb2s\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:24 crc kubenswrapper[4771]: I0319 15:51:24.968049 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:25 crc kubenswrapper[4771]: I0319 15:51:25.429716 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-scb2s"] Mar 19 15:51:25 crc kubenswrapper[4771]: I0319 15:51:25.767821 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerID="a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97" exitCode=0 Mar 19 15:51:25 crc kubenswrapper[4771]: I0319 15:51:25.767916 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scb2s" event={"ID":"b3d975a0-5943-4643-a74f-6a5b35efd538","Type":"ContainerDied","Data":"a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97"} Mar 19 15:51:25 crc kubenswrapper[4771]: I0319 15:51:25.768157 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scb2s" event={"ID":"b3d975a0-5943-4643-a74f-6a5b35efd538","Type":"ContainerStarted","Data":"fba345525180c52ac2721e9ec1b82b7cffe49169b3f755bec7e42cc8118f644f"} Mar 19 15:51:26 crc kubenswrapper[4771]: I0319 15:51:26.508980 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce" Mar 19 15:51:26 crc kubenswrapper[4771]: E0319 15:51:26.509316 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:51:27 crc kubenswrapper[4771]: I0319 15:51:27.795176 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3d975a0-5943-4643-a74f-6a5b35efd538" 
containerID="7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e" exitCode=0 Mar 19 15:51:27 crc kubenswrapper[4771]: I0319 15:51:27.795350 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scb2s" event={"ID":"b3d975a0-5943-4643-a74f-6a5b35efd538","Type":"ContainerDied","Data":"7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e"} Mar 19 15:51:28 crc kubenswrapper[4771]: I0319 15:51:28.808770 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scb2s" event={"ID":"b3d975a0-5943-4643-a74f-6a5b35efd538","Type":"ContainerStarted","Data":"79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2"} Mar 19 15:51:28 crc kubenswrapper[4771]: I0319 15:51:28.825979 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-scb2s" podStartSLOduration=2.271825275 podStartE2EDuration="4.825960261s" podCreationTimestamp="2026-03-19 15:51:24 +0000 UTC" firstStartedPulling="2026-03-19 15:51:25.769622088 +0000 UTC m=+2144.998243300" lastFinishedPulling="2026-03-19 15:51:28.323757084 +0000 UTC m=+2147.552378286" observedRunningTime="2026-03-19 15:51:28.82470026 +0000 UTC m=+2148.053321472" watchObservedRunningTime="2026-03-19 15:51:28.825960261 +0000 UTC m=+2148.054581463" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.185915 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-948kw"] Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.188072 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.207159 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-948kw"] Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.274668 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-catalog-content\") pod \"redhat-operators-948kw\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.274720 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-utilities\") pod \"redhat-operators-948kw\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.274744 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjghj\" (UniqueName: \"kubernetes.io/projected/5f4b1cce-3784-4c61-9110-f80f41742e0c-kube-api-access-cjghj\") pod \"redhat-operators-948kw\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.375580 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-utilities\") pod \"redhat-operators-948kw\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.375634 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cjghj\" (UniqueName: \"kubernetes.io/projected/5f4b1cce-3784-4c61-9110-f80f41742e0c-kube-api-access-cjghj\") pod \"redhat-operators-948kw\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.375788 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-catalog-content\") pod \"redhat-operators-948kw\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.376510 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-catalog-content\") pod \"redhat-operators-948kw\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.376514 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-utilities\") pod \"redhat-operators-948kw\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.398333 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjghj\" (UniqueName: \"kubernetes.io/projected/5f4b1cce-3784-4c61-9110-f80f41742e0c-kube-api-access-cjghj\") pod \"redhat-operators-948kw\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.504850 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:30 crc kubenswrapper[4771]: W0319 15:51:30.993535 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f4b1cce_3784_4c61_9110_f80f41742e0c.slice/crio-c75f680c785179f561f1956fb6bc9f448a08ca8421f9a808c18b2e1380977a25 WatchSource:0}: Error finding container c75f680c785179f561f1956fb6bc9f448a08ca8421f9a808c18b2e1380977a25: Status 404 returned error can't find the container with id c75f680c785179f561f1956fb6bc9f448a08ca8421f9a808c18b2e1380977a25 Mar 19 15:51:30 crc kubenswrapper[4771]: I0319 15:51:30.994642 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-948kw"] Mar 19 15:51:31 crc kubenswrapper[4771]: I0319 15:51:31.829556 4771 generic.go:334] "Generic (PLEG): container finished" podID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerID="744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5" exitCode=0 Mar 19 15:51:31 crc kubenswrapper[4771]: I0319 15:51:31.829645 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-948kw" event={"ID":"5f4b1cce-3784-4c61-9110-f80f41742e0c","Type":"ContainerDied","Data":"744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5"} Mar 19 15:51:31 crc kubenswrapper[4771]: I0319 15:51:31.830802 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-948kw" event={"ID":"5f4b1cce-3784-4c61-9110-f80f41742e0c","Type":"ContainerStarted","Data":"c75f680c785179f561f1956fb6bc9f448a08ca8421f9a808c18b2e1380977a25"} Mar 19 15:51:31 crc kubenswrapper[4771]: I0319 15:51:31.977008 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n7cxt"] Mar 19 15:51:31 crc kubenswrapper[4771]: I0319 15:51:31.979142 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:31 crc kubenswrapper[4771]: I0319 15:51:31.988609 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7cxt"] Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.107698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-catalog-content\") pod \"community-operators-n7cxt\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") " pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.107780 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9b67\" (UniqueName: \"kubernetes.io/projected/a3443197-32ed-4688-85dc-c0e8242fb4e5-kube-api-access-n9b67\") pod \"community-operators-n7cxt\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") " pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.107874 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-utilities\") pod \"community-operators-n7cxt\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") " pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.209964 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-catalog-content\") pod \"community-operators-n7cxt\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") " pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.210177 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n9b67\" (UniqueName: \"kubernetes.io/projected/a3443197-32ed-4688-85dc-c0e8242fb4e5-kube-api-access-n9b67\") pod \"community-operators-n7cxt\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") " pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.210307 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-utilities\") pod \"community-operators-n7cxt\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") " pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.210532 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-catalog-content\") pod \"community-operators-n7cxt\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") " pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.210896 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-utilities\") pod \"community-operators-n7cxt\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") " pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.234295 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9b67\" (UniqueName: \"kubernetes.io/projected/a3443197-32ed-4688-85dc-c0e8242fb4e5-kube-api-access-n9b67\") pod \"community-operators-n7cxt\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") " pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.310967 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:32 crc kubenswrapper[4771]: I0319 15:51:32.873404 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7cxt"] Mar 19 15:51:32 crc kubenswrapper[4771]: W0319 15:51:32.890191 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3443197_32ed_4688_85dc_c0e8242fb4e5.slice/crio-7945ad1cadcf6e8593bbe62b08d95a1d8ce0a1365209893869889a110d39ec2e WatchSource:0}: Error finding container 7945ad1cadcf6e8593bbe62b08d95a1d8ce0a1365209893869889a110d39ec2e: Status 404 returned error can't find the container with id 7945ad1cadcf6e8593bbe62b08d95a1d8ce0a1365209893869889a110d39ec2e Mar 19 15:51:33 crc kubenswrapper[4771]: I0319 15:51:33.847958 4771 generic.go:334] "Generic (PLEG): container finished" podID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerID="f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4" exitCode=0 Mar 19 15:51:33 crc kubenswrapper[4771]: I0319 15:51:33.848028 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-948kw" event={"ID":"5f4b1cce-3784-4c61-9110-f80f41742e0c","Type":"ContainerDied","Data":"f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4"} Mar 19 15:51:33 crc kubenswrapper[4771]: I0319 15:51:33.850420 4771 generic.go:334] "Generic (PLEG): container finished" podID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerID="10b8bb6779a8a2d2e861a865878b185d71ba4a54ecda2d8c900ef7b80488e1c6" exitCode=0 Mar 19 15:51:33 crc kubenswrapper[4771]: I0319 15:51:33.850524 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7cxt" event={"ID":"a3443197-32ed-4688-85dc-c0e8242fb4e5","Type":"ContainerDied","Data":"10b8bb6779a8a2d2e861a865878b185d71ba4a54ecda2d8c900ef7b80488e1c6"} Mar 19 15:51:33 crc kubenswrapper[4771]: I0319 
15:51:33.850596 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7cxt" event={"ID":"a3443197-32ed-4688-85dc-c0e8242fb4e5","Type":"ContainerStarted","Data":"7945ad1cadcf6e8593bbe62b08d95a1d8ce0a1365209893869889a110d39ec2e"} Mar 19 15:51:34 crc kubenswrapper[4771]: I0319 15:51:34.861490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-948kw" event={"ID":"5f4b1cce-3784-4c61-9110-f80f41742e0c","Type":"ContainerStarted","Data":"fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5"} Mar 19 15:51:34 crc kubenswrapper[4771]: I0319 15:51:34.867634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7cxt" event={"ID":"a3443197-32ed-4688-85dc-c0e8242fb4e5","Type":"ContainerStarted","Data":"0195da8d9497cbb3431f0dd28866edc22ade07ccd34bd0f3629f488b9d00cd87"} Mar 19 15:51:34 crc kubenswrapper[4771]: I0319 15:51:34.911805 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-948kw" podStartSLOduration=2.5360709200000002 podStartE2EDuration="4.911788969s" podCreationTimestamp="2026-03-19 15:51:30 +0000 UTC" firstStartedPulling="2026-03-19 15:51:31.831228223 +0000 UTC m=+2151.059849425" lastFinishedPulling="2026-03-19 15:51:34.206946272 +0000 UTC m=+2153.435567474" observedRunningTime="2026-03-19 15:51:34.90931117 +0000 UTC m=+2154.137932382" watchObservedRunningTime="2026-03-19 15:51:34.911788969 +0000 UTC m=+2154.140410171" Mar 19 15:51:34 crc kubenswrapper[4771]: I0319 15:51:34.968417 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:34 crc kubenswrapper[4771]: I0319 15:51:34.984071 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:35 crc kubenswrapper[4771]: I0319 15:51:35.070560 
4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:35 crc kubenswrapper[4771]: I0319 15:51:35.509435 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" Mar 19 15:51:35 crc kubenswrapper[4771]: E0319 15:51:35.509668 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:51:35 crc kubenswrapper[4771]: I0319 15:51:35.932043 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:37 crc kubenswrapper[4771]: I0319 15:51:37.895404 4771 generic.go:334] "Generic (PLEG): container finished" podID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerID="0195da8d9497cbb3431f0dd28866edc22ade07ccd34bd0f3629f488b9d00cd87" exitCode=0 Mar 19 15:51:37 crc kubenswrapper[4771]: I0319 15:51:37.896307 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7cxt" event={"ID":"a3443197-32ed-4688-85dc-c0e8242fb4e5","Type":"ContainerDied","Data":"0195da8d9497cbb3431f0dd28866edc22ade07ccd34bd0f3629f488b9d00cd87"} Mar 19 15:51:38 crc kubenswrapper[4771]: I0319 15:51:38.508478 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce" Mar 19 15:51:38 crc kubenswrapper[4771]: E0319 15:51:38.508696 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" 
pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:51:38 crc kubenswrapper[4771]: I0319 15:51:38.593458 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-scb2s"] Mar 19 15:51:38 crc kubenswrapper[4771]: I0319 15:51:38.913606 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7cxt" event={"ID":"a3443197-32ed-4688-85dc-c0e8242fb4e5","Type":"ContainerStarted","Data":"b301e9ca17f1d727b5be1139f8b63bd918c8b71db8de883ffa949f4c4073eaf5"} Mar 19 15:51:38 crc kubenswrapper[4771]: I0319 15:51:38.913958 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-scb2s" podUID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerName="registry-server" containerID="cri-o://79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2" gracePeriod=2 Mar 19 15:51:38 crc kubenswrapper[4771]: I0319 15:51:38.945787 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n7cxt" podStartSLOduration=3.174755973 podStartE2EDuration="7.94577027s" podCreationTimestamp="2026-03-19 15:51:31 +0000 UTC" firstStartedPulling="2026-03-19 15:51:33.852078263 +0000 UTC m=+2153.080699465" lastFinishedPulling="2026-03-19 15:51:38.62309256 +0000 UTC m=+2157.851713762" observedRunningTime="2026-03-19 15:51:38.939050856 +0000 UTC m=+2158.167672058" watchObservedRunningTime="2026-03-19 15:51:38.94577027 +0000 UTC m=+2158.174391472" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.317524 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.419530 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-utilities\") pod \"b3d975a0-5943-4643-a74f-6a5b35efd538\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.419711 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdhs2\" (UniqueName: \"kubernetes.io/projected/b3d975a0-5943-4643-a74f-6a5b35efd538-kube-api-access-fdhs2\") pod \"b3d975a0-5943-4643-a74f-6a5b35efd538\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.419778 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-catalog-content\") pod \"b3d975a0-5943-4643-a74f-6a5b35efd538\" (UID: \"b3d975a0-5943-4643-a74f-6a5b35efd538\") " Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.420490 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-utilities" (OuterVolumeSpecName: "utilities") pod "b3d975a0-5943-4643-a74f-6a5b35efd538" (UID: "b3d975a0-5943-4643-a74f-6a5b35efd538"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.448034 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3d975a0-5943-4643-a74f-6a5b35efd538" (UID: "b3d975a0-5943-4643-a74f-6a5b35efd538"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.465014 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d975a0-5943-4643-a74f-6a5b35efd538-kube-api-access-fdhs2" (OuterVolumeSpecName: "kube-api-access-fdhs2") pod "b3d975a0-5943-4643-a74f-6a5b35efd538" (UID: "b3d975a0-5943-4643-a74f-6a5b35efd538"). InnerVolumeSpecName "kube-api-access-fdhs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.521317 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.521350 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d975a0-5943-4643-a74f-6a5b35efd538-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.521364 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdhs2\" (UniqueName: \"kubernetes.io/projected/b3d975a0-5943-4643-a74f-6a5b35efd538-kube-api-access-fdhs2\") on node \"crc\" DevicePath \"\"" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.922751 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerID="79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2" exitCode=0 Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.922803 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scb2s" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.922811 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scb2s" event={"ID":"b3d975a0-5943-4643-a74f-6a5b35efd538","Type":"ContainerDied","Data":"79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2"} Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.923258 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scb2s" event={"ID":"b3d975a0-5943-4643-a74f-6a5b35efd538","Type":"ContainerDied","Data":"fba345525180c52ac2721e9ec1b82b7cffe49169b3f755bec7e42cc8118f644f"} Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.923280 4771 scope.go:117] "RemoveContainer" containerID="79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.945325 4771 scope.go:117] "RemoveContainer" containerID="7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e" Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.947189 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-scb2s"] Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.956086 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-scb2s"] Mar 19 15:51:39 crc kubenswrapper[4771]: I0319 15:51:39.967485 4771 scope.go:117] "RemoveContainer" containerID="a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97" Mar 19 15:51:40 crc kubenswrapper[4771]: I0319 15:51:40.021495 4771 scope.go:117] "RemoveContainer" containerID="79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2" Mar 19 15:51:40 crc kubenswrapper[4771]: E0319 15:51:40.022055 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2\": container with ID starting with 79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2 not found: ID does not exist" containerID="79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2" Mar 19 15:51:40 crc kubenswrapper[4771]: I0319 15:51:40.022103 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2"} err="failed to get container status \"79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2\": rpc error: code = NotFound desc = could not find container \"79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2\": container with ID starting with 79c85e690182f103d646126cecb3f783b84277025b79d15fb7f5b305f1bce7a2 not found: ID does not exist" Mar 19 15:51:40 crc kubenswrapper[4771]: I0319 15:51:40.022129 4771 scope.go:117] "RemoveContainer" containerID="7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e" Mar 19 15:51:40 crc kubenswrapper[4771]: E0319 15:51:40.022442 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e\": container with ID starting with 7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e not found: ID does not exist" containerID="7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e" Mar 19 15:51:40 crc kubenswrapper[4771]: I0319 15:51:40.022475 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e"} err="failed to get container status \"7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e\": rpc error: code = NotFound desc = could not find container \"7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e\": container with ID 
starting with 7ad221b6daf5c6a26374ef1fc71173aa02ec0dfe5dea58d746cf4a1e9d1c970e not found: ID does not exist" Mar 19 15:51:40 crc kubenswrapper[4771]: I0319 15:51:40.022493 4771 scope.go:117] "RemoveContainer" containerID="a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97" Mar 19 15:51:40 crc kubenswrapper[4771]: E0319 15:51:40.022764 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97\": container with ID starting with a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97 not found: ID does not exist" containerID="a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97" Mar 19 15:51:40 crc kubenswrapper[4771]: I0319 15:51:40.022802 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97"} err="failed to get container status \"a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97\": rpc error: code = NotFound desc = could not find container \"a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97\": container with ID starting with a0fd9cd60880b7a321b95c41d344151d5779c77324bbe0d9b69f0f8e7d663b97 not found: ID does not exist" Mar 19 15:51:40 crc kubenswrapper[4771]: I0319 15:51:40.505908 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:40 crc kubenswrapper[4771]: I0319 15:51:40.505973 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:41 crc kubenswrapper[4771]: I0319 15:51:41.525822 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d975a0-5943-4643-a74f-6a5b35efd538" path="/var/lib/kubelet/pods/b3d975a0-5943-4643-a74f-6a5b35efd538/volumes" Mar 19 15:51:41 crc 
kubenswrapper[4771]: I0319 15:51:41.561756 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-948kw" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerName="registry-server" probeResult="failure" output=< Mar 19 15:51:41 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Mar 19 15:51:41 crc kubenswrapper[4771]: > Mar 19 15:51:42 crc kubenswrapper[4771]: I0319 15:51:42.311655 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:42 crc kubenswrapper[4771]: I0319 15:51:42.311903 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:42 crc kubenswrapper[4771]: I0319 15:51:42.389132 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:47 crc kubenswrapper[4771]: I0319 15:51:47.508780 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3" Mar 19 15:51:47 crc kubenswrapper[4771]: E0319 15:51:47.509568 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:51:50 crc kubenswrapper[4771]: I0319 15:51:50.573551 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:50 crc kubenswrapper[4771]: I0319 15:51:50.628974 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:50 crc kubenswrapper[4771]: I0319 15:51:50.817406 4771 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-948kw"] Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.043642 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-948kw" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerName="registry-server" containerID="cri-o://fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5" gracePeriod=2 Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.402781 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n7cxt" Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.540003 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.668154 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-utilities\") pod \"5f4b1cce-3784-4c61-9110-f80f41742e0c\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.668194 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjghj\" (UniqueName: \"kubernetes.io/projected/5f4b1cce-3784-4c61-9110-f80f41742e0c-kube-api-access-cjghj\") pod \"5f4b1cce-3784-4c61-9110-f80f41742e0c\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.668354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-catalog-content\") pod \"5f4b1cce-3784-4c61-9110-f80f41742e0c\" (UID: \"5f4b1cce-3784-4c61-9110-f80f41742e0c\") " Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.668844 4771 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-utilities" (OuterVolumeSpecName: "utilities") pod "5f4b1cce-3784-4c61-9110-f80f41742e0c" (UID: "5f4b1cce-3784-4c61-9110-f80f41742e0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.674702 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f4b1cce-3784-4c61-9110-f80f41742e0c-kube-api-access-cjghj" (OuterVolumeSpecName: "kube-api-access-cjghj") pod "5f4b1cce-3784-4c61-9110-f80f41742e0c" (UID: "5f4b1cce-3784-4c61-9110-f80f41742e0c"). InnerVolumeSpecName "kube-api-access-cjghj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.769969 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.770016 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjghj\" (UniqueName: \"kubernetes.io/projected/5f4b1cce-3784-4c61-9110-f80f41742e0c-kube-api-access-cjghj\") on node \"crc\" DevicePath \"\"" Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.855349 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f4b1cce-3784-4c61-9110-f80f41742e0c" (UID: "5f4b1cce-3784-4c61-9110-f80f41742e0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:51:52 crc kubenswrapper[4771]: I0319 15:51:52.871232 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4b1cce-3784-4c61-9110-f80f41742e0c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.053133 4771 generic.go:334] "Generic (PLEG): container finished" podID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerID="fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5" exitCode=0 Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.053186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-948kw" event={"ID":"5f4b1cce-3784-4c61-9110-f80f41742e0c","Type":"ContainerDied","Data":"fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5"} Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.053217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-948kw" event={"ID":"5f4b1cce-3784-4c61-9110-f80f41742e0c","Type":"ContainerDied","Data":"c75f680c785179f561f1956fb6bc9f448a08ca8421f9a808c18b2e1380977a25"} Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.053239 4771 scope.go:117] "RemoveContainer" containerID="fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5" Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.053263 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-948kw" Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.071292 4771 scope.go:117] "RemoveContainer" containerID="f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4" Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.095596 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-948kw"] Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.101860 4771 scope.go:117] "RemoveContainer" containerID="744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5" Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.107536 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-948kw"] Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.125851 4771 scope.go:117] "RemoveContainer" containerID="fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5" Mar 19 15:51:53 crc kubenswrapper[4771]: E0319 15:51:53.126227 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5\": container with ID starting with fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5 not found: ID does not exist" containerID="fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5" Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.126270 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5"} err="failed to get container status \"fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5\": rpc error: code = NotFound desc = could not find container \"fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5\": container with ID starting with fa735866c81cd0dd3ae2808fe898bd519cf1038c03e7c9563dabc2d4d9ed34a5 not found: ID does 
not exist"
Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.126298 4771 scope.go:117] "RemoveContainer" containerID="f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4"
Mar 19 15:51:53 crc kubenswrapper[4771]: E0319 15:51:53.126751 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4\": container with ID starting with f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4 not found: ID does not exist" containerID="f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4"
Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.126781 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4"} err="failed to get container status \"f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4\": rpc error: code = NotFound desc = could not find container \"f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4\": container with ID starting with f0251999d73dc7247008e4c28fa0e57c3724498cc39571a379426b6b3abe81c4 not found: ID does not exist"
Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.126802 4771 scope.go:117] "RemoveContainer" containerID="744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5"
Mar 19 15:51:53 crc kubenswrapper[4771]: E0319 15:51:53.127040 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5\": container with ID starting with 744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5 not found: ID does not exist" containerID="744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5"
Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.127072 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5"} err="failed to get container status \"744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5\": rpc error: code = NotFound desc = could not find container \"744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5\": container with ID starting with 744c579aaf7532cb77fe78b78cc3ccf54ec4b48aeae9becfa4054371450919c5 not found: ID does not exist"
Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.508947 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:51:53 crc kubenswrapper[4771]: E0319 15:51:53.509464 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.520467 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" path="/var/lib/kubelet/pods/5f4b1cce-3784-4c61-9110-f80f41742e0c/volumes"
Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.826876 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7cxt"]
Mar 19 15:51:53 crc kubenswrapper[4771]: I0319 15:51:53.828052 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n7cxt" podUID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerName="registry-server" containerID="cri-o://b301e9ca17f1d727b5be1139f8b63bd918c8b71db8de883ffa949f4c4073eaf5" gracePeriod=2
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.078432 4771 generic.go:334] "Generic (PLEG): container finished" podID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerID="b301e9ca17f1d727b5be1139f8b63bd918c8b71db8de883ffa949f4c4073eaf5" exitCode=0
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.078484 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7cxt" event={"ID":"a3443197-32ed-4688-85dc-c0e8242fb4e5","Type":"ContainerDied","Data":"b301e9ca17f1d727b5be1139f8b63bd918c8b71db8de883ffa949f4c4073eaf5"}
Mar 19 15:51:54 crc kubenswrapper[4771]: E0319 15:51:54.111732 4771 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc"
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.245893 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7cxt"
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.395874 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-catalog-content\") pod \"a3443197-32ed-4688-85dc-c0e8242fb4e5\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") "
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.395952 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-utilities\") pod \"a3443197-32ed-4688-85dc-c0e8242fb4e5\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") "
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.396022 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9b67\" (UniqueName: \"kubernetes.io/projected/a3443197-32ed-4688-85dc-c0e8242fb4e5-kube-api-access-n9b67\") pod \"a3443197-32ed-4688-85dc-c0e8242fb4e5\" (UID: \"a3443197-32ed-4688-85dc-c0e8242fb4e5\") "
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.396727 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-utilities" (OuterVolumeSpecName: "utilities") pod "a3443197-32ed-4688-85dc-c0e8242fb4e5" (UID: "a3443197-32ed-4688-85dc-c0e8242fb4e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.400783 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3443197-32ed-4688-85dc-c0e8242fb4e5-kube-api-access-n9b67" (OuterVolumeSpecName: "kube-api-access-n9b67") pod "a3443197-32ed-4688-85dc-c0e8242fb4e5" (UID: "a3443197-32ed-4688-85dc-c0e8242fb4e5"). InnerVolumeSpecName "kube-api-access-n9b67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.499942 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9b67\" (UniqueName: \"kubernetes.io/projected/a3443197-32ed-4688-85dc-c0e8242fb4e5-kube-api-access-n9b67\") on node \"crc\" DevicePath \"\""
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.500001 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.507058 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3443197-32ed-4688-85dc-c0e8242fb4e5" (UID: "a3443197-32ed-4688-85dc-c0e8242fb4e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 15:51:54 crc kubenswrapper[4771]: I0319 15:51:54.601599 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3443197-32ed-4688-85dc-c0e8242fb4e5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 15:51:55 crc kubenswrapper[4771]: I0319 15:51:55.091954 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7cxt" event={"ID":"a3443197-32ed-4688-85dc-c0e8242fb4e5","Type":"ContainerDied","Data":"7945ad1cadcf6e8593bbe62b08d95a1d8ce0a1365209893869889a110d39ec2e"}
Mar 19 15:51:55 crc kubenswrapper[4771]: I0319 15:51:55.092047 4771 scope.go:117] "RemoveContainer" containerID="b301e9ca17f1d727b5be1139f8b63bd918c8b71db8de883ffa949f4c4073eaf5"
Mar 19 15:51:55 crc kubenswrapper[4771]: I0319 15:51:55.092114 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7cxt"
Mar 19 15:51:55 crc kubenswrapper[4771]: I0319 15:51:55.119383 4771 scope.go:117] "RemoveContainer" containerID="0195da8d9497cbb3431f0dd28866edc22ade07ccd34bd0f3629f488b9d00cd87"
Mar 19 15:51:55 crc kubenswrapper[4771]: I0319 15:51:55.154790 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7cxt"]
Mar 19 15:51:55 crc kubenswrapper[4771]: I0319 15:51:55.167829 4771 scope.go:117] "RemoveContainer" containerID="10b8bb6779a8a2d2e861a865878b185d71ba4a54ecda2d8c900ef7b80488e1c6"
Mar 19 15:51:55 crc kubenswrapper[4771]: I0319 15:51:55.168971 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n7cxt"]
Mar 19 15:51:55 crc kubenswrapper[4771]: I0319 15:51:55.522711 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3443197-32ed-4688-85dc-c0e8242fb4e5" path="/var/lib/kubelet/pods/a3443197-32ed-4688-85dc-c0e8242fb4e5/volumes"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.146870 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565592-vhvjg"]
Mar 19 15:52:00 crc kubenswrapper[4771]: E0319 15:52:00.147772 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerName="extract-utilities"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.147787 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerName="extract-utilities"
Mar 19 15:52:00 crc kubenswrapper[4771]: E0319 15:52:00.147803 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerName="registry-server"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.147811 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerName="registry-server"
Mar 19 15:52:00 crc kubenswrapper[4771]: E0319 15:52:00.147832 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerName="extract-content"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.147840 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerName="extract-content"
Mar 19 15:52:00 crc kubenswrapper[4771]: E0319 15:52:00.147858 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerName="extract-content"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.147866 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerName="extract-content"
Mar 19 15:52:00 crc kubenswrapper[4771]: E0319 15:52:00.147880 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerName="extract-utilities"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.147889 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerName="extract-utilities"
Mar 19 15:52:00 crc kubenswrapper[4771]: E0319 15:52:00.147907 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerName="extract-utilities"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.147916 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerName="extract-utilities"
Mar 19 15:52:00 crc kubenswrapper[4771]: E0319 15:52:00.147928 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerName="registry-server"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.147936 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerName="registry-server"
Mar 19 15:52:00 crc kubenswrapper[4771]: E0319 15:52:00.147946 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerName="extract-content"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.147953 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerName="extract-content"
Mar 19 15:52:00 crc kubenswrapper[4771]: E0319 15:52:00.147969 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerName="registry-server"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.147976 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerName="registry-server"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.148169 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f4b1cce-3784-4c61-9110-f80f41742e0c" containerName="registry-server"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.148185 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3443197-32ed-4688-85dc-c0e8242fb4e5" containerName="registry-server"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.148211 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d975a0-5943-4643-a74f-6a5b35efd538" containerName="registry-server"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.148782 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565592-vhvjg"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.153245 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.154242 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.154742 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.159314 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565592-vhvjg"]
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.312793 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75xv4\" (UniqueName: \"kubernetes.io/projected/aa4e8243-5a8d-43e8-816e-d1f92d947d7b-kube-api-access-75xv4\") pod \"auto-csr-approver-29565592-vhvjg\" (UID: \"aa4e8243-5a8d-43e8-816e-d1f92d947d7b\") " pod="openshift-infra/auto-csr-approver-29565592-vhvjg"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.414817 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75xv4\" (UniqueName: \"kubernetes.io/projected/aa4e8243-5a8d-43e8-816e-d1f92d947d7b-kube-api-access-75xv4\") pod \"auto-csr-approver-29565592-vhvjg\" (UID: \"aa4e8243-5a8d-43e8-816e-d1f92d947d7b\") " pod="openshift-infra/auto-csr-approver-29565592-vhvjg"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.434923 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75xv4\" (UniqueName: \"kubernetes.io/projected/aa4e8243-5a8d-43e8-816e-d1f92d947d7b-kube-api-access-75xv4\") pod \"auto-csr-approver-29565592-vhvjg\" (UID: \"aa4e8243-5a8d-43e8-816e-d1f92d947d7b\") " pod="openshift-infra/auto-csr-approver-29565592-vhvjg"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.510612 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565592-vhvjg"
Mar 19 15:52:00 crc kubenswrapper[4771]: I0319 15:52:00.970865 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565592-vhvjg"]
Mar 19 15:52:01 crc kubenswrapper[4771]: I0319 15:52:01.147226 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565592-vhvjg" event={"ID":"aa4e8243-5a8d-43e8-816e-d1f92d947d7b","Type":"ContainerStarted","Data":"a66e977b3ed08175637d51902e092ceca464a69a9fad088adf45ebf87cb92686"}
Mar 19 15:52:02 crc kubenswrapper[4771]: I0319 15:52:02.509054 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:52:02 crc kubenswrapper[4771]: E0319 15:52:02.509764 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:52:03 crc kubenswrapper[4771]: I0319 15:52:03.163705 4771 generic.go:334] "Generic (PLEG): container finished" podID="aa4e8243-5a8d-43e8-816e-d1f92d947d7b" containerID="e337875a05a23413148288438d66390227ba7be1e41eb96ec05c723812e9e553" exitCode=0
Mar 19 15:52:03 crc kubenswrapper[4771]: I0319 15:52:03.163759 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565592-vhvjg" event={"ID":"aa4e8243-5a8d-43e8-816e-d1f92d947d7b","Type":"ContainerDied","Data":"e337875a05a23413148288438d66390227ba7be1e41eb96ec05c723812e9e553"}
Mar 19 15:52:04 crc kubenswrapper[4771]: I0319 15:52:04.513418 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565592-vhvjg"
Mar 19 15:52:04 crc kubenswrapper[4771]: I0319 15:52:04.679544 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75xv4\" (UniqueName: \"kubernetes.io/projected/aa4e8243-5a8d-43e8-816e-d1f92d947d7b-kube-api-access-75xv4\") pod \"aa4e8243-5a8d-43e8-816e-d1f92d947d7b\" (UID: \"aa4e8243-5a8d-43e8-816e-d1f92d947d7b\") "
Mar 19 15:52:04 crc kubenswrapper[4771]: I0319 15:52:04.685526 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4e8243-5a8d-43e8-816e-d1f92d947d7b-kube-api-access-75xv4" (OuterVolumeSpecName: "kube-api-access-75xv4") pod "aa4e8243-5a8d-43e8-816e-d1f92d947d7b" (UID: "aa4e8243-5a8d-43e8-816e-d1f92d947d7b"). InnerVolumeSpecName "kube-api-access-75xv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:52:04 crc kubenswrapper[4771]: I0319 15:52:04.781481 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75xv4\" (UniqueName: \"kubernetes.io/projected/aa4e8243-5a8d-43e8-816e-d1f92d947d7b-kube-api-access-75xv4\") on node \"crc\" DevicePath \"\""
Mar 19 15:52:05 crc kubenswrapper[4771]: I0319 15:52:05.184154 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565592-vhvjg" event={"ID":"aa4e8243-5a8d-43e8-816e-d1f92d947d7b","Type":"ContainerDied","Data":"a66e977b3ed08175637d51902e092ceca464a69a9fad088adf45ebf87cb92686"}
Mar 19 15:52:05 crc kubenswrapper[4771]: I0319 15:52:05.184634 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a66e977b3ed08175637d51902e092ceca464a69a9fad088adf45ebf87cb92686"
Mar 19 15:52:05 crc kubenswrapper[4771]: I0319 15:52:05.184229 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565592-vhvjg"
Mar 19 15:52:05 crc kubenswrapper[4771]: I0319 15:52:05.589193 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565586-7r4gc"]
Mar 19 15:52:05 crc kubenswrapper[4771]: I0319 15:52:05.596231 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565586-7r4gc"]
Mar 19 15:52:07 crc kubenswrapper[4771]: I0319 15:52:07.532087 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc761847-2b88-49a3-ae95-0229cbb8bd98" path="/var/lib/kubelet/pods/cc761847-2b88-49a3-ae95-0229cbb8bd98/volumes"
Mar 19 15:52:08 crc kubenswrapper[4771]: I0319 15:52:08.508074 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:52:09 crc kubenswrapper[4771]: I0319 15:52:09.219771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"}
Mar 19 15:52:09 crc kubenswrapper[4771]: I0319 15:52:09.220355 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 19 15:52:13 crc kubenswrapper[4771]: I0319 15:52:13.254948 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" exitCode=0
Mar 19 15:52:13 crc kubenswrapper[4771]: I0319 15:52:13.255043 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"}
Mar 19 15:52:13 crc kubenswrapper[4771]: I0319 15:52:13.255542 4771 scope.go:117] "RemoveContainer" containerID="1ef12b655dd9ef435998cead844899fa42656fb2e2850f4f2ef645f3aa3016ce"
Mar 19 15:52:13 crc kubenswrapper[4771]: I0319 15:52:13.256164 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:52:13 crc kubenswrapper[4771]: E0319 15:52:13.256375 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:52:16 crc kubenswrapper[4771]: I0319 15:52:16.509363 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:52:17 crc kubenswrapper[4771]: I0319 15:52:17.299474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"}
Mar 19 15:52:17 crc kubenswrapper[4771]: I0319 15:52:17.300821 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 15:52:21 crc kubenswrapper[4771]: I0319 15:52:21.348374 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" exitCode=0
Mar 19 15:52:21 crc kubenswrapper[4771]: I0319 15:52:21.348435 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"}
Mar 19 15:52:21 crc kubenswrapper[4771]: I0319 15:52:21.348484 4771 scope.go:117] "RemoveContainer" containerID="b679eb3a4980c447710e4dcc79c2ab79590325c537c0399ecfa57c365bb2ecc3"
Mar 19 15:52:21 crc kubenswrapper[4771]: I0319 15:52:21.349412 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:52:21 crc kubenswrapper[4771]: E0319 15:52:21.350015 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:52:24 crc kubenswrapper[4771]: I0319 15:52:24.509175 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:52:24 crc kubenswrapper[4771]: E0319 15:52:24.510497 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:52:32 crc kubenswrapper[4771]: I0319 15:52:32.509303 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:52:32 crc kubenswrapper[4771]: E0319 15:52:32.510161 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:52:37 crc kubenswrapper[4771]: I0319 15:52:37.508389 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:52:37 crc kubenswrapper[4771]: E0319 15:52:37.509223 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:52:44 crc kubenswrapper[4771]: I0319 15:52:44.508470 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:52:44 crc kubenswrapper[4771]: E0319 15:52:44.509497 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:52:48 crc kubenswrapper[4771]: I0319 15:52:48.509337 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:52:48 crc kubenswrapper[4771]: E0319 15:52:48.510150 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:52:53 crc kubenswrapper[4771]: I0319 15:52:53.027548 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:52:53 crc kubenswrapper[4771]: I0319 15:52:53.028184 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:52:55 crc kubenswrapper[4771]: I0319 15:52:55.508931 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:52:55 crc kubenswrapper[4771]: E0319 15:52:55.509571 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:52:58 crc kubenswrapper[4771]: I0319 15:52:58.617788 4771 scope.go:117] "RemoveContainer" containerID="bacf2a660e041a39d1d3ab52925d8cf8a5121c96f14de231eeaebb0311c7feac"
Mar 19 15:53:01 crc kubenswrapper[4771]: I0319 15:53:01.515251 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:53:01 crc kubenswrapper[4771]: E0319 15:53:01.516341 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:53:07 crc kubenswrapper[4771]: I0319 15:53:07.508803 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:53:07 crc kubenswrapper[4771]: E0319 15:53:07.509929 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:53:13 crc kubenswrapper[4771]: I0319 15:53:13.509930 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:53:13 crc kubenswrapper[4771]: E0319 15:53:13.511352 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:53:19 crc kubenswrapper[4771]: I0319 15:53:19.510354 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:53:19 crc kubenswrapper[4771]: E0319 15:53:19.511749 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:53:23 crc kubenswrapper[4771]: I0319 15:53:23.027821 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:53:23 crc kubenswrapper[4771]: I0319 15:53:23.028311 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:53:25 crc kubenswrapper[4771]: I0319 15:53:25.509205 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:53:25 crc kubenswrapper[4771]: E0319 15:53:25.510131 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:53:32 crc kubenswrapper[4771]: I0319 15:53:32.509638 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:53:32 crc kubenswrapper[4771]: E0319 15:53:32.510527 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:53:40 crc kubenswrapper[4771]: I0319 15:53:40.508545 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:53:40 crc kubenswrapper[4771]: E0319 15:53:40.509441 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:53:43 crc kubenswrapper[4771]: I0319 15:53:43.509296 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:53:43 crc kubenswrapper[4771]: E0319 15:53:43.510599 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:53:51 crc kubenswrapper[4771]: I0319 15:53:51.515747 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:53:51 crc kubenswrapper[4771]: E0319 15:53:51.516601 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:53:53 crc kubenswrapper[4771]: I0319 15:53:53.027776 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 15:53:53 crc kubenswrapper[4771]: I0319 15:53:53.027858 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 15:53:53 crc kubenswrapper[4771]: I0319 15:53:53.027917 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp"
Mar 19 15:53:53 crc kubenswrapper[4771]: I0319 15:53:53.028721 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 15:53:53 crc kubenswrapper[4771]: I0319 15:53:53.028792 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" gracePeriod=600
Mar 19 15:53:53 crc kubenswrapper[4771]: E0319 15:53:53.166328 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:53:53 crc kubenswrapper[4771]: I0319 15:53:53.212116 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" exitCode=0
Mar 19 15:53:53 crc kubenswrapper[4771]: I0319 15:53:53.212163 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"}
Mar 19 15:53:53 crc kubenswrapper[4771]: I0319 15:53:53.212200 4771 scope.go:117] "RemoveContainer" containerID="07a56d01ab75b71b5af456be4839284ab56bbaec8d73ef490a93e8d9a6ad4e01"
Mar 19 15:53:53 crc kubenswrapper[4771]: I0319 15:53:53.212779 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:53:53 crc kubenswrapper[4771]: E0319 15:53:53.213094 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:53:58 crc kubenswrapper[4771]: I0319 15:53:58.509972 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:53:58 crc kubenswrapper[4771]: E0319 15:53:58.511538 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.156485 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565594-f7lhs"]
Mar 19 15:54:00 crc kubenswrapper[4771]: E0319 15:54:00.156890 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4e8243-5a8d-43e8-816e-d1f92d947d7b" containerName="oc"
Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.156905 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4e8243-5a8d-43e8-816e-d1f92d947d7b" containerName="oc"
Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.157141 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4e8243-5a8d-43e8-816e-d1f92d947d7b" containerName="oc"
Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.157957 4771 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565594-f7lhs" Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.161041 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.161315 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.161539 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.168412 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565594-f7lhs"] Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.258083 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276f2\" (UniqueName: \"kubernetes.io/projected/acbc2604-3df1-40dc-a50b-7c516c8ee3c6-kube-api-access-276f2\") pod \"auto-csr-approver-29565594-f7lhs\" (UID: \"acbc2604-3df1-40dc-a50b-7c516c8ee3c6\") " pod="openshift-infra/auto-csr-approver-29565594-f7lhs" Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.360172 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276f2\" (UniqueName: \"kubernetes.io/projected/acbc2604-3df1-40dc-a50b-7c516c8ee3c6-kube-api-access-276f2\") pod \"auto-csr-approver-29565594-f7lhs\" (UID: \"acbc2604-3df1-40dc-a50b-7c516c8ee3c6\") " pod="openshift-infra/auto-csr-approver-29565594-f7lhs" Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.381126 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276f2\" (UniqueName: \"kubernetes.io/projected/acbc2604-3df1-40dc-a50b-7c516c8ee3c6-kube-api-access-276f2\") pod \"auto-csr-approver-29565594-f7lhs\" (UID: \"acbc2604-3df1-40dc-a50b-7c516c8ee3c6\") " 
pod="openshift-infra/auto-csr-approver-29565594-f7lhs" Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.480642 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565594-f7lhs" Mar 19 15:54:00 crc kubenswrapper[4771]: I0319 15:54:00.981100 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565594-f7lhs"] Mar 19 15:54:01 crc kubenswrapper[4771]: I0319 15:54:01.292753 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565594-f7lhs" event={"ID":"acbc2604-3df1-40dc-a50b-7c516c8ee3c6","Type":"ContainerStarted","Data":"398d3b3a958f8762c3ea682fd1d89dfbebf8d0e4218954e5281cde5781a2d42f"} Mar 19 15:54:03 crc kubenswrapper[4771]: I0319 15:54:03.314340 4771 generic.go:334] "Generic (PLEG): container finished" podID="acbc2604-3df1-40dc-a50b-7c516c8ee3c6" containerID="ba30c999595dde6311398027fb6f1ad65a93d653d1ed44df2b81a82c88b8cc19" exitCode=0 Mar 19 15:54:03 crc kubenswrapper[4771]: I0319 15:54:03.314437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565594-f7lhs" event={"ID":"acbc2604-3df1-40dc-a50b-7c516c8ee3c6","Type":"ContainerDied","Data":"ba30c999595dde6311398027fb6f1ad65a93d653d1ed44df2b81a82c88b8cc19"} Mar 19 15:54:04 crc kubenswrapper[4771]: I0319 15:54:04.775309 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565594-f7lhs" Mar 19 15:54:04 crc kubenswrapper[4771]: I0319 15:54:04.857505 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-276f2\" (UniqueName: \"kubernetes.io/projected/acbc2604-3df1-40dc-a50b-7c516c8ee3c6-kube-api-access-276f2\") pod \"acbc2604-3df1-40dc-a50b-7c516c8ee3c6\" (UID: \"acbc2604-3df1-40dc-a50b-7c516c8ee3c6\") " Mar 19 15:54:04 crc kubenswrapper[4771]: I0319 15:54:04.863142 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbc2604-3df1-40dc-a50b-7c516c8ee3c6-kube-api-access-276f2" (OuterVolumeSpecName: "kube-api-access-276f2") pod "acbc2604-3df1-40dc-a50b-7c516c8ee3c6" (UID: "acbc2604-3df1-40dc-a50b-7c516c8ee3c6"). InnerVolumeSpecName "kube-api-access-276f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:54:04 crc kubenswrapper[4771]: I0319 15:54:04.959509 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-276f2\" (UniqueName: \"kubernetes.io/projected/acbc2604-3df1-40dc-a50b-7c516c8ee3c6-kube-api-access-276f2\") on node \"crc\" DevicePath \"\"" Mar 19 15:54:05 crc kubenswrapper[4771]: I0319 15:54:05.340489 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565594-f7lhs" event={"ID":"acbc2604-3df1-40dc-a50b-7c516c8ee3c6","Type":"ContainerDied","Data":"398d3b3a958f8762c3ea682fd1d89dfbebf8d0e4218954e5281cde5781a2d42f"} Mar 19 15:54:05 crc kubenswrapper[4771]: I0319 15:54:05.340554 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="398d3b3a958f8762c3ea682fd1d89dfbebf8d0e4218954e5281cde5781a2d42f" Mar 19 15:54:05 crc kubenswrapper[4771]: I0319 15:54:05.340634 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565594-f7lhs" Mar 19 15:54:05 crc kubenswrapper[4771]: I0319 15:54:05.508903 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:54:05 crc kubenswrapper[4771]: E0319 15:54:05.509351 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:54:05 crc kubenswrapper[4771]: I0319 15:54:05.866702 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565588-zswmt"] Mar 19 15:54:05 crc kubenswrapper[4771]: I0319 15:54:05.872156 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565588-zswmt"] Mar 19 15:54:07 crc kubenswrapper[4771]: I0319 15:54:07.528514 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f245468b-c53c-4eff-9a82-a8ed0153a10d" path="/var/lib/kubelet/pods/f245468b-c53c-4eff-9a82-a8ed0153a10d/volumes" Mar 19 15:54:08 crc kubenswrapper[4771]: I0319 15:54:08.509259 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:54:08 crc kubenswrapper[4771]: E0319 15:54:08.509469 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:54:12 crc kubenswrapper[4771]: I0319 15:54:12.510680 4771 
scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:54:12 crc kubenswrapper[4771]: E0319 15:54:12.512609 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:54:19 crc kubenswrapper[4771]: I0319 15:54:19.509458 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:54:19 crc kubenswrapper[4771]: E0319 15:54:19.510600 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:54:21 crc kubenswrapper[4771]: I0319 15:54:21.518658 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:54:21 crc kubenswrapper[4771]: E0319 15:54:21.519598 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:54:26 crc kubenswrapper[4771]: I0319 15:54:26.509427 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:54:26 crc kubenswrapper[4771]: E0319 
15:54:26.511315 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:54:31 crc kubenswrapper[4771]: I0319 15:54:31.513128 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:54:31 crc kubenswrapper[4771]: E0319 15:54:31.513758 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:54:33 crc kubenswrapper[4771]: I0319 15:54:33.509360 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:54:33 crc kubenswrapper[4771]: E0319 15:54:33.510349 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:54:37 crc kubenswrapper[4771]: I0319 15:54:37.509740 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:54:37 crc kubenswrapper[4771]: E0319 15:54:37.512978 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.225434 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vjs4k"] Mar 19 15:54:38 crc kubenswrapper[4771]: E0319 15:54:38.225796 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbc2604-3df1-40dc-a50b-7c516c8ee3c6" containerName="oc" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.226025 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbc2604-3df1-40dc-a50b-7c516c8ee3c6" containerName="oc" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.226230 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbc2604-3df1-40dc-a50b-7c516c8ee3c6" containerName="oc" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.227563 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.249809 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vjs4k"] Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.333142 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-catalog-content\") pod \"certified-operators-vjs4k\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.333261 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjzl\" (UniqueName: \"kubernetes.io/projected/f6a23432-56c1-4062-b0cd-bad1e6bed575-kube-api-access-lrjzl\") pod \"certified-operators-vjs4k\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.333311 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-utilities\") pod \"certified-operators-vjs4k\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.434917 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-catalog-content\") pod \"certified-operators-vjs4k\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.436331 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lrjzl\" (UniqueName: \"kubernetes.io/projected/f6a23432-56c1-4062-b0cd-bad1e6bed575-kube-api-access-lrjzl\") pod \"certified-operators-vjs4k\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.436414 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-utilities\") pod \"certified-operators-vjs4k\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.436881 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-utilities\") pod \"certified-operators-vjs4k\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.436186 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-catalog-content\") pod \"certified-operators-vjs4k\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.469534 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjzl\" (UniqueName: \"kubernetes.io/projected/f6a23432-56c1-4062-b0cd-bad1e6bed575-kube-api-access-lrjzl\") pod \"certified-operators-vjs4k\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:38 crc kubenswrapper[4771]: I0319 15:54:38.558672 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:39 crc kubenswrapper[4771]: I0319 15:54:39.023786 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vjs4k"] Mar 19 15:54:39 crc kubenswrapper[4771]: I0319 15:54:39.675142 4771 generic.go:334] "Generic (PLEG): container finished" podID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerID="2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d" exitCode=0 Mar 19 15:54:39 crc kubenswrapper[4771]: I0319 15:54:39.675185 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjs4k" event={"ID":"f6a23432-56c1-4062-b0cd-bad1e6bed575","Type":"ContainerDied","Data":"2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d"} Mar 19 15:54:39 crc kubenswrapper[4771]: I0319 15:54:39.675209 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjs4k" event={"ID":"f6a23432-56c1-4062-b0cd-bad1e6bed575","Type":"ContainerStarted","Data":"6fb698a2e6fd39feda96364850fdb9bf4fa26787c673f07beddccb0260df443c"} Mar 19 15:54:41 crc kubenswrapper[4771]: I0319 15:54:41.703683 4771 generic.go:334] "Generic (PLEG): container finished" podID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerID="1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec" exitCode=0 Mar 19 15:54:41 crc kubenswrapper[4771]: I0319 15:54:41.704152 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjs4k" event={"ID":"f6a23432-56c1-4062-b0cd-bad1e6bed575","Type":"ContainerDied","Data":"1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec"} Mar 19 15:54:42 crc kubenswrapper[4771]: I0319 15:54:42.714407 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjs4k" 
event={"ID":"f6a23432-56c1-4062-b0cd-bad1e6bed575","Type":"ContainerStarted","Data":"8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60"} Mar 19 15:54:42 crc kubenswrapper[4771]: I0319 15:54:42.763750 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vjs4k" podStartSLOduration=2.267759175 podStartE2EDuration="4.763728247s" podCreationTimestamp="2026-03-19 15:54:38 +0000 UTC" firstStartedPulling="2026-03-19 15:54:39.677597774 +0000 UTC m=+2338.906218976" lastFinishedPulling="2026-03-19 15:54:42.173566856 +0000 UTC m=+2341.402188048" observedRunningTime="2026-03-19 15:54:42.746571226 +0000 UTC m=+2341.975192438" watchObservedRunningTime="2026-03-19 15:54:42.763728247 +0000 UTC m=+2341.992349469" Mar 19 15:54:43 crc kubenswrapper[4771]: I0319 15:54:43.509595 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:54:43 crc kubenswrapper[4771]: E0319 15:54:43.509909 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:54:48 crc kubenswrapper[4771]: I0319 15:54:48.509841 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:54:48 crc kubenswrapper[4771]: E0319 15:54:48.510770 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" 
podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:54:48 crc kubenswrapper[4771]: I0319 15:54:48.559103 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:48 crc kubenswrapper[4771]: I0319 15:54:48.559209 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:48 crc kubenswrapper[4771]: I0319 15:54:48.619457 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:48 crc kubenswrapper[4771]: I0319 15:54:48.838747 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:48 crc kubenswrapper[4771]: I0319 15:54:48.887941 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vjs4k"] Mar 19 15:54:50 crc kubenswrapper[4771]: I0319 15:54:50.509516 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:54:50 crc kubenswrapper[4771]: E0319 15:54:50.510071 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:54:50 crc kubenswrapper[4771]: I0319 15:54:50.792884 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vjs4k" podUID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerName="registry-server" containerID="cri-o://8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60" gracePeriod=2 Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 
15:54:51.206313 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.390371 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-utilities\") pod \"f6a23432-56c1-4062-b0cd-bad1e6bed575\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.390424 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-catalog-content\") pod \"f6a23432-56c1-4062-b0cd-bad1e6bed575\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.390487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrjzl\" (UniqueName: \"kubernetes.io/projected/f6a23432-56c1-4062-b0cd-bad1e6bed575-kube-api-access-lrjzl\") pod \"f6a23432-56c1-4062-b0cd-bad1e6bed575\" (UID: \"f6a23432-56c1-4062-b0cd-bad1e6bed575\") " Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.391340 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-utilities" (OuterVolumeSpecName: "utilities") pod "f6a23432-56c1-4062-b0cd-bad1e6bed575" (UID: "f6a23432-56c1-4062-b0cd-bad1e6bed575"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.393611 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.399207 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a23432-56c1-4062-b0cd-bad1e6bed575-kube-api-access-lrjzl" (OuterVolumeSpecName: "kube-api-access-lrjzl") pod "f6a23432-56c1-4062-b0cd-bad1e6bed575" (UID: "f6a23432-56c1-4062-b0cd-bad1e6bed575"). InnerVolumeSpecName "kube-api-access-lrjzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.448568 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6a23432-56c1-4062-b0cd-bad1e6bed575" (UID: "f6a23432-56c1-4062-b0cd-bad1e6bed575"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.495533 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a23432-56c1-4062-b0cd-bad1e6bed575-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.495569 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrjzl\" (UniqueName: \"kubernetes.io/projected/f6a23432-56c1-4062-b0cd-bad1e6bed575-kube-api-access-lrjzl\") on node \"crc\" DevicePath \"\"" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.805567 4771 generic.go:334] "Generic (PLEG): container finished" podID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerID="8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60" exitCode=0 Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.805714 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vjs4k" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.805741 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjs4k" event={"ID":"f6a23432-56c1-4062-b0cd-bad1e6bed575","Type":"ContainerDied","Data":"8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60"} Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.806828 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vjs4k" event={"ID":"f6a23432-56c1-4062-b0cd-bad1e6bed575","Type":"ContainerDied","Data":"6fb698a2e6fd39feda96364850fdb9bf4fa26787c673f07beddccb0260df443c"} Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.806869 4771 scope.go:117] "RemoveContainer" containerID="8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.830363 4771 scope.go:117] "RemoveContainer" 
containerID="1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.836867 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vjs4k"] Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.847068 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vjs4k"] Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.868312 4771 scope.go:117] "RemoveContainer" containerID="2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.894742 4771 scope.go:117] "RemoveContainer" containerID="8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60" Mar 19 15:54:51 crc kubenswrapper[4771]: E0319 15:54:51.895218 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60\": container with ID starting with 8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60 not found: ID does not exist" containerID="8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.895295 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60"} err="failed to get container status \"8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60\": rpc error: code = NotFound desc = could not find container \"8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60\": container with ID starting with 8ddba97d75515a1835c9bc98600bd528d7c7029e3b309fc5b4e6087e26a72f60 not found: ID does not exist" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.895330 4771 scope.go:117] "RemoveContainer" 
containerID="1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec" Mar 19 15:54:51 crc kubenswrapper[4771]: E0319 15:54:51.896390 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec\": container with ID starting with 1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec not found: ID does not exist" containerID="1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.896480 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec"} err="failed to get container status \"1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec\": rpc error: code = NotFound desc = could not find container \"1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec\": container with ID starting with 1c81524b5f8e7e98e0711c00771542fd14ea1eb84ccd43d10d28f1072e5e9eec not found: ID does not exist" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.896508 4771 scope.go:117] "RemoveContainer" containerID="2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d" Mar 19 15:54:51 crc kubenswrapper[4771]: E0319 15:54:51.896953 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d\": container with ID starting with 2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d not found: ID does not exist" containerID="2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d" Mar 19 15:54:51 crc kubenswrapper[4771]: I0319 15:54:51.897022 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d"} err="failed to get container status \"2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d\": rpc error: code = NotFound desc = could not find container \"2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d\": container with ID starting with 2ed7de14bbd6004dc5cef563a8acea72ab1dffdbdfa99929c260944bef70609d not found: ID does not exist" Mar 19 15:54:53 crc kubenswrapper[4771]: I0319 15:54:53.526814 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6a23432-56c1-4062-b0cd-bad1e6bed575" path="/var/lib/kubelet/pods/f6a23432-56c1-4062-b0cd-bad1e6bed575/volumes" Mar 19 15:54:57 crc kubenswrapper[4771]: I0319 15:54:57.509447 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:54:57 crc kubenswrapper[4771]: E0319 15:54:57.510497 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:54:58 crc kubenswrapper[4771]: I0319 15:54:58.726625 4771 scope.go:117] "RemoveContainer" containerID="928457a83c024a6414511c683217b3b727209f17c6a604b5196a63cb6a243886" Mar 19 15:55:02 crc kubenswrapper[4771]: I0319 15:55:02.509166 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:55:02 crc kubenswrapper[4771]: E0319 15:55:02.510601 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:55:04 crc kubenswrapper[4771]: I0319 15:55:04.508835 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:55:04 crc kubenswrapper[4771]: E0319 15:55:04.509662 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:55:12 crc kubenswrapper[4771]: I0319 15:55:12.509591 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:55:12 crc kubenswrapper[4771]: E0319 15:55:12.511735 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:55:13 crc kubenswrapper[4771]: I0319 15:55:13.509729 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:55:13 crc kubenswrapper[4771]: E0319 15:55:13.510276 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:55:19 crc kubenswrapper[4771]: I0319 15:55:19.509458 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:55:19 crc kubenswrapper[4771]: E0319 15:55:19.511713 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:55:24 crc kubenswrapper[4771]: I0319 15:55:24.509908 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:55:24 crc kubenswrapper[4771]: E0319 15:55:24.510972 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:55:25 crc kubenswrapper[4771]: I0319 15:55:25.511194 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:55:25 crc kubenswrapper[4771]: E0319 15:55:25.517082 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:55:33 crc kubenswrapper[4771]: 
I0319 15:55:33.509092 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:55:33 crc kubenswrapper[4771]: E0319 15:55:33.509919 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:55:36 crc kubenswrapper[4771]: I0319 15:55:36.509194 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:55:36 crc kubenswrapper[4771]: E0319 15:55:36.509916 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:55:39 crc kubenswrapper[4771]: I0319 15:55:39.509521 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:55:39 crc kubenswrapper[4771]: E0319 15:55:39.510730 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:55:45 crc kubenswrapper[4771]: I0319 15:55:45.509909 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:55:45 crc 
kubenswrapper[4771]: E0319 15:55:45.511255 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:55:51 crc kubenswrapper[4771]: I0319 15:55:51.516963 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:55:51 crc kubenswrapper[4771]: E0319 15:55:51.517829 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:55:53 crc kubenswrapper[4771]: I0319 15:55:53.508922 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:55:53 crc kubenswrapper[4771]: E0319 15:55:53.509280 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:55:59 crc kubenswrapper[4771]: I0319 15:55:59.509487 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:55:59 crc kubenswrapper[4771]: E0319 15:55:59.510503 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.154289 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565596-gknh6"] Mar 19 15:56:00 crc kubenswrapper[4771]: E0319 15:56:00.155019 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerName="registry-server" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.155036 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerName="registry-server" Mar 19 15:56:00 crc kubenswrapper[4771]: E0319 15:56:00.155055 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerName="extract-content" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.155062 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerName="extract-content" Mar 19 15:56:00 crc kubenswrapper[4771]: E0319 15:56:00.155086 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerName="extract-utilities" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.155094 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerName="extract-utilities" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.155277 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a23432-56c1-4062-b0cd-bad1e6bed575" containerName="registry-server" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.155808 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565596-gknh6" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.158976 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.159052 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.159392 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.183852 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565596-gknh6"] Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.217680 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps4b5\" (UniqueName: \"kubernetes.io/projected/d7206757-9d91-42ee-b7f6-6470ab1ccab7-kube-api-access-ps4b5\") pod \"auto-csr-approver-29565596-gknh6\" (UID: \"d7206757-9d91-42ee-b7f6-6470ab1ccab7\") " pod="openshift-infra/auto-csr-approver-29565596-gknh6" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.319842 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4b5\" (UniqueName: \"kubernetes.io/projected/d7206757-9d91-42ee-b7f6-6470ab1ccab7-kube-api-access-ps4b5\") pod \"auto-csr-approver-29565596-gknh6\" (UID: \"d7206757-9d91-42ee-b7f6-6470ab1ccab7\") " pod="openshift-infra/auto-csr-approver-29565596-gknh6" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.351728 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4b5\" (UniqueName: \"kubernetes.io/projected/d7206757-9d91-42ee-b7f6-6470ab1ccab7-kube-api-access-ps4b5\") pod \"auto-csr-approver-29565596-gknh6\" (UID: \"d7206757-9d91-42ee-b7f6-6470ab1ccab7\") " 
pod="openshift-infra/auto-csr-approver-29565596-gknh6" Mar 19 15:56:00 crc kubenswrapper[4771]: I0319 15:56:00.481312 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565596-gknh6" Mar 19 15:56:01 crc kubenswrapper[4771]: I0319 15:56:01.056824 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565596-gknh6"] Mar 19 15:56:01 crc kubenswrapper[4771]: W0319 15:56:01.060638 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7206757_9d91_42ee_b7f6_6470ab1ccab7.slice/crio-d0719e14db4694e953ff54bceb60dfd2c227f67eeac2c4da3a9b6ef48e71ca2c WatchSource:0}: Error finding container d0719e14db4694e953ff54bceb60dfd2c227f67eeac2c4da3a9b6ef48e71ca2c: Status 404 returned error can't find the container with id d0719e14db4694e953ff54bceb60dfd2c227f67eeac2c4da3a9b6ef48e71ca2c Mar 19 15:56:01 crc kubenswrapper[4771]: I0319 15:56:01.063558 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 15:56:01 crc kubenswrapper[4771]: I0319 15:56:01.491300 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565596-gknh6" event={"ID":"d7206757-9d91-42ee-b7f6-6470ab1ccab7","Type":"ContainerStarted","Data":"d0719e14db4694e953ff54bceb60dfd2c227f67eeac2c4da3a9b6ef48e71ca2c"} Mar 19 15:56:02 crc kubenswrapper[4771]: I0319 15:56:02.500516 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565596-gknh6" event={"ID":"d7206757-9d91-42ee-b7f6-6470ab1ccab7","Type":"ContainerStarted","Data":"09e82ae5589b0dce46f27363b47c5a6b9cfa6b71f3bdcd3a9c01a3c53849337a"} Mar 19 15:56:02 crc kubenswrapper[4771]: I0319 15:56:02.514068 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565596-gknh6" podStartSLOduration=1.45460869 
podStartE2EDuration="2.514037444s" podCreationTimestamp="2026-03-19 15:56:00 +0000 UTC" firstStartedPulling="2026-03-19 15:56:01.063193079 +0000 UTC m=+2420.291814311" lastFinishedPulling="2026-03-19 15:56:02.122621823 +0000 UTC m=+2421.351243065" observedRunningTime="2026-03-19 15:56:02.513162103 +0000 UTC m=+2421.741783305" watchObservedRunningTime="2026-03-19 15:56:02.514037444 +0000 UTC m=+2421.742658676" Mar 19 15:56:03 crc kubenswrapper[4771]: I0319 15:56:03.512829 4771 generic.go:334] "Generic (PLEG): container finished" podID="d7206757-9d91-42ee-b7f6-6470ab1ccab7" containerID="09e82ae5589b0dce46f27363b47c5a6b9cfa6b71f3bdcd3a9c01a3c53849337a" exitCode=0 Mar 19 15:56:03 crc kubenswrapper[4771]: I0319 15:56:03.534361 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565596-gknh6" event={"ID":"d7206757-9d91-42ee-b7f6-6470ab1ccab7","Type":"ContainerDied","Data":"09e82ae5589b0dce46f27363b47c5a6b9cfa6b71f3bdcd3a9c01a3c53849337a"} Mar 19 15:56:04 crc kubenswrapper[4771]: I0319 15:56:04.509585 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:56:04 crc kubenswrapper[4771]: E0319 15:56:04.510815 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:56:04 crc kubenswrapper[4771]: I0319 15:56:04.902529 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565596-gknh6" Mar 19 15:56:05 crc kubenswrapper[4771]: I0319 15:56:05.008594 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps4b5\" (UniqueName: \"kubernetes.io/projected/d7206757-9d91-42ee-b7f6-6470ab1ccab7-kube-api-access-ps4b5\") pod \"d7206757-9d91-42ee-b7f6-6470ab1ccab7\" (UID: \"d7206757-9d91-42ee-b7f6-6470ab1ccab7\") " Mar 19 15:56:05 crc kubenswrapper[4771]: I0319 15:56:05.015271 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7206757-9d91-42ee-b7f6-6470ab1ccab7-kube-api-access-ps4b5" (OuterVolumeSpecName: "kube-api-access-ps4b5") pod "d7206757-9d91-42ee-b7f6-6470ab1ccab7" (UID: "d7206757-9d91-42ee-b7f6-6470ab1ccab7"). InnerVolumeSpecName "kube-api-access-ps4b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 15:56:05 crc kubenswrapper[4771]: I0319 15:56:05.111322 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps4b5\" (UniqueName: \"kubernetes.io/projected/d7206757-9d91-42ee-b7f6-6470ab1ccab7-kube-api-access-ps4b5\") on node \"crc\" DevicePath \"\"" Mar 19 15:56:05 crc kubenswrapper[4771]: I0319 15:56:05.536413 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565596-gknh6" event={"ID":"d7206757-9d91-42ee-b7f6-6470ab1ccab7","Type":"ContainerDied","Data":"d0719e14db4694e953ff54bceb60dfd2c227f67eeac2c4da3a9b6ef48e71ca2c"} Mar 19 15:56:05 crc kubenswrapper[4771]: I0319 15:56:05.536450 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0719e14db4694e953ff54bceb60dfd2c227f67eeac2c4da3a9b6ef48e71ca2c" Mar 19 15:56:05 crc kubenswrapper[4771]: I0319 15:56:05.536543 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565596-gknh6" Mar 19 15:56:05 crc kubenswrapper[4771]: I0319 15:56:05.990082 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565590-f82zb"] Mar 19 15:56:05 crc kubenswrapper[4771]: I0319 15:56:05.996513 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565590-f82zb"] Mar 19 15:56:06 crc kubenswrapper[4771]: I0319 15:56:06.509633 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:56:06 crc kubenswrapper[4771]: E0319 15:56:06.510177 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:56:07 crc kubenswrapper[4771]: I0319 15:56:07.528610 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f19c29be-d0ee-4c67-a8d4-340e1d97108f" path="/var/lib/kubelet/pods/f19c29be-d0ee-4c67-a8d4-340e1d97108f/volumes" Mar 19 15:56:14 crc kubenswrapper[4771]: I0319 15:56:14.509813 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:56:14 crc kubenswrapper[4771]: E0319 15:56:14.510903 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:56:17 crc kubenswrapper[4771]: I0319 15:56:17.509197 4771 scope.go:117] "RemoveContainer" 
containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:56:17 crc kubenswrapper[4771]: E0319 15:56:17.510183 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:56:19 crc kubenswrapper[4771]: I0319 15:56:19.509399 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:56:19 crc kubenswrapper[4771]: E0319 15:56:19.509659 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:56:28 crc kubenswrapper[4771]: I0319 15:56:28.509028 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:56:28 crc kubenswrapper[4771]: I0319 15:56:28.509885 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:56:28 crc kubenswrapper[4771]: E0319 15:56:28.509967 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" 
podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:56:28 crc kubenswrapper[4771]: E0319 15:56:28.510295 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:56:34 crc kubenswrapper[4771]: I0319 15:56:34.508659 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:56:34 crc kubenswrapper[4771]: E0319 15:56:34.509475 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:56:39 crc kubenswrapper[4771]: I0319 15:56:39.509546 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:56:39 crc kubenswrapper[4771]: E0319 15:56:39.509965 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:56:41 crc kubenswrapper[4771]: I0319 15:56:41.516562 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:56:41 crc kubenswrapper[4771]: E0319 15:56:41.517226 4771 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 15:56:47 crc kubenswrapper[4771]: I0319 15:56:47.509731 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2" Mar 19 15:56:47 crc kubenswrapper[4771]: E0319 15:56:47.511170 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 15:56:52 crc kubenswrapper[4771]: I0319 15:56:52.509500 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 15:56:52 crc kubenswrapper[4771]: E0319 15:56:52.510563 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 15:56:53 crc kubenswrapper[4771]: I0319 15:56:53.508737 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539" Mar 19 15:56:53 crc kubenswrapper[4771]: E0319 15:56:53.509326 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:56:58 crc kubenswrapper[4771]: I0319 15:56:58.509607 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:56:58 crc kubenswrapper[4771]: E0319 15:56:58.510708 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:56:58 crc kubenswrapper[4771]: I0319 15:56:58.888333 4771 scope.go:117] "RemoveContainer" containerID="ee0d28b104df4a1c35428059af049cdbc7b79c6630c3781b1d122d96edfaa250"
Mar 19 15:57:06 crc kubenswrapper[4771]: I0319 15:57:06.510181 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:57:06 crc kubenswrapper[4771]: E0319 15:57:06.511329 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:57:07 crc kubenswrapper[4771]: I0319 15:57:07.510387 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:57:07 crc kubenswrapper[4771]: E0319 15:57:07.510828 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:57:09 crc kubenswrapper[4771]: I0319 15:57:09.508707 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:57:09 crc kubenswrapper[4771]: E0319 15:57:09.509214 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:57:19 crc kubenswrapper[4771]: I0319 15:57:19.509630 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:57:19 crc kubenswrapper[4771]: E0319 15:57:19.510821 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:57:21 crc kubenswrapper[4771]: I0319 15:57:21.508965 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:57:22 crc kubenswrapper[4771]: I0319 15:57:22.307032 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"}
Mar 19 15:57:22 crc kubenswrapper[4771]: I0319 15:57:22.307638 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 15:57:23 crc kubenswrapper[4771]: I0319 15:57:23.508900 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:57:24 crc kubenswrapper[4771]: I0319 15:57:24.325198 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"}
Mar 19 15:57:24 crc kubenswrapper[4771]: I0319 15:57:24.325764 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 19 15:57:26 crc kubenswrapper[4771]: I0319 15:57:26.346132 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" exitCode=0
Mar 19 15:57:26 crc kubenswrapper[4771]: I0319 15:57:26.346207 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"}
Mar 19 15:57:26 crc kubenswrapper[4771]: I0319 15:57:26.346456 4771 scope.go:117] "RemoveContainer" containerID="52755e84d949e7fdcf58f4218d02c378b0dad1819547d2ae362e5bdb04354539"
Mar 19 15:57:26 crc kubenswrapper[4771]: I0319 15:57:26.347421 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:57:26 crc kubenswrapper[4771]: E0319 15:57:26.347819 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:57:28 crc kubenswrapper[4771]: I0319 15:57:28.372442 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" exitCode=0
Mar 19 15:57:28 crc kubenswrapper[4771]: I0319 15:57:28.372500 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"}
Mar 19 15:57:28 crc kubenswrapper[4771]: I0319 15:57:28.372537 4771 scope.go:117] "RemoveContainer" containerID="bf7e543b59d7180ae0704df5c3052b12dbb17a20b7e91128baa2e487fe3cf8d2"
Mar 19 15:57:28 crc kubenswrapper[4771]: I0319 15:57:28.373136 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:57:28 crc kubenswrapper[4771]: E0319 15:57:28.373468 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:57:31 crc kubenswrapper[4771]: I0319 15:57:31.515616 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:57:31 crc kubenswrapper[4771]: E0319 15:57:31.516651 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:57:40 crc kubenswrapper[4771]: I0319 15:57:40.509063 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:57:40 crc kubenswrapper[4771]: I0319 15:57:40.509567 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:57:40 crc kubenswrapper[4771]: E0319 15:57:40.509655 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:57:40 crc kubenswrapper[4771]: E0319 15:57:40.510036 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:57:42 crc kubenswrapper[4771]: I0319 15:57:42.508973 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:57:42 crc kubenswrapper[4771]: E0319 15:57:42.509920 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:57:52 crc kubenswrapper[4771]: I0319 15:57:52.509408 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:57:52 crc kubenswrapper[4771]: E0319 15:57:52.510451 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:57:53 crc kubenswrapper[4771]: I0319 15:57:53.509206 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:57:53 crc kubenswrapper[4771]: E0319 15:57:53.509483 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:57:54 crc kubenswrapper[4771]: I0319 15:57:54.508908 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:57:54 crc kubenswrapper[4771]: E0319 15:57:54.509227 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.144678 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565598-6tfnn"]
Mar 19 15:58:00 crc kubenswrapper[4771]: E0319 15:58:00.145665 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7206757-9d91-42ee-b7f6-6470ab1ccab7" containerName="oc"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.145680 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7206757-9d91-42ee-b7f6-6470ab1ccab7" containerName="oc"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.145889 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7206757-9d91-42ee-b7f6-6470ab1ccab7" containerName="oc"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.146603 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565598-6tfnn"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.149816 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.150740 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.152520 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.176177 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565598-6tfnn"]
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.225707 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8kql\" (UniqueName: \"kubernetes.io/projected/303cb7b8-fa9b-4e41-856f-4205f162828e-kube-api-access-s8kql\") pod \"auto-csr-approver-29565598-6tfnn\" (UID: \"303cb7b8-fa9b-4e41-856f-4205f162828e\") " pod="openshift-infra/auto-csr-approver-29565598-6tfnn"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.328154 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8kql\" (UniqueName: \"kubernetes.io/projected/303cb7b8-fa9b-4e41-856f-4205f162828e-kube-api-access-s8kql\") pod \"auto-csr-approver-29565598-6tfnn\" (UID: \"303cb7b8-fa9b-4e41-856f-4205f162828e\") " pod="openshift-infra/auto-csr-approver-29565598-6tfnn"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.360666 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8kql\" (UniqueName: \"kubernetes.io/projected/303cb7b8-fa9b-4e41-856f-4205f162828e-kube-api-access-s8kql\") pod \"auto-csr-approver-29565598-6tfnn\" (UID: \"303cb7b8-fa9b-4e41-856f-4205f162828e\") " pod="openshift-infra/auto-csr-approver-29565598-6tfnn"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.478294 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565598-6tfnn"
Mar 19 15:58:00 crc kubenswrapper[4771]: I0319 15:58:00.931055 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565598-6tfnn"]
Mar 19 15:58:00 crc kubenswrapper[4771]: W0319 15:58:00.943587 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303cb7b8_fa9b_4e41_856f_4205f162828e.slice/crio-445489a77501fa5200e78028475c599e3bc5ab4b6d893ae09d05bdedbe0ddade WatchSource:0}: Error finding container 445489a77501fa5200e78028475c599e3bc5ab4b6d893ae09d05bdedbe0ddade: Status 404 returned error can't find the container with id 445489a77501fa5200e78028475c599e3bc5ab4b6d893ae09d05bdedbe0ddade
Mar 19 15:58:01 crc kubenswrapper[4771]: I0319 15:58:01.715687 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565598-6tfnn" event={"ID":"303cb7b8-fa9b-4e41-856f-4205f162828e","Type":"ContainerStarted","Data":"445489a77501fa5200e78028475c599e3bc5ab4b6d893ae09d05bdedbe0ddade"}
Mar 19 15:58:02 crc kubenswrapper[4771]: I0319 15:58:02.725661 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565598-6tfnn" event={"ID":"303cb7b8-fa9b-4e41-856f-4205f162828e","Type":"ContainerStarted","Data":"7437276c07b68d9014ad1324fcf83136a2ceabc1d4f19b08ddf3e230ea9ea494"}
Mar 19 15:58:02 crc kubenswrapper[4771]: I0319 15:58:02.750587 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565598-6tfnn" podStartSLOduration=1.42885733 podStartE2EDuration="2.75056049s" podCreationTimestamp="2026-03-19 15:58:00 +0000 UTC" firstStartedPulling="2026-03-19 15:58:00.946679308 +0000 UTC m=+2540.175300530" lastFinishedPulling="2026-03-19 15:58:02.268382448 +0000 UTC m=+2541.497003690" observedRunningTime="2026-03-19 15:58:02.746158524 +0000 UTC m=+2541.974779756" watchObservedRunningTime="2026-03-19 15:58:02.75056049 +0000 UTC m=+2541.979181722"
Mar 19 15:58:03 crc kubenswrapper[4771]: I0319 15:58:03.508809 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:58:03 crc kubenswrapper[4771]: E0319 15:58:03.509339 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:58:03 crc kubenswrapper[4771]: I0319 15:58:03.740670 4771 generic.go:334] "Generic (PLEG): container finished" podID="303cb7b8-fa9b-4e41-856f-4205f162828e" containerID="7437276c07b68d9014ad1324fcf83136a2ceabc1d4f19b08ddf3e230ea9ea494" exitCode=0
Mar 19 15:58:03 crc kubenswrapper[4771]: I0319 15:58:03.740961 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565598-6tfnn" event={"ID":"303cb7b8-fa9b-4e41-856f-4205f162828e","Type":"ContainerDied","Data":"7437276c07b68d9014ad1324fcf83136a2ceabc1d4f19b08ddf3e230ea9ea494"}
Mar 19 15:58:05 crc kubenswrapper[4771]: I0319 15:58:05.176848 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565598-6tfnn"
Mar 19 15:58:05 crc kubenswrapper[4771]: I0319 15:58:05.228714 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8kql\" (UniqueName: \"kubernetes.io/projected/303cb7b8-fa9b-4e41-856f-4205f162828e-kube-api-access-s8kql\") pod \"303cb7b8-fa9b-4e41-856f-4205f162828e\" (UID: \"303cb7b8-fa9b-4e41-856f-4205f162828e\") "
Mar 19 15:58:05 crc kubenswrapper[4771]: I0319 15:58:05.241149 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303cb7b8-fa9b-4e41-856f-4205f162828e-kube-api-access-s8kql" (OuterVolumeSpecName: "kube-api-access-s8kql") pod "303cb7b8-fa9b-4e41-856f-4205f162828e" (UID: "303cb7b8-fa9b-4e41-856f-4205f162828e"). InnerVolumeSpecName "kube-api-access-s8kql". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 15:58:05 crc kubenswrapper[4771]: I0319 15:58:05.331315 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8kql\" (UniqueName: \"kubernetes.io/projected/303cb7b8-fa9b-4e41-856f-4205f162828e-kube-api-access-s8kql\") on node \"crc\" DevicePath \"\""
Mar 19 15:58:05 crc kubenswrapper[4771]: I0319 15:58:05.786961 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565598-6tfnn" event={"ID":"303cb7b8-fa9b-4e41-856f-4205f162828e","Type":"ContainerDied","Data":"445489a77501fa5200e78028475c599e3bc5ab4b6d893ae09d05bdedbe0ddade"}
Mar 19 15:58:05 crc kubenswrapper[4771]: I0319 15:58:05.787092 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="445489a77501fa5200e78028475c599e3bc5ab4b6d893ae09d05bdedbe0ddade"
Mar 19 15:58:05 crc kubenswrapper[4771]: I0319 15:58:05.787200 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565598-6tfnn"
Mar 19 15:58:05 crc kubenswrapper[4771]: I0319 15:58:05.835612 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565592-vhvjg"]
Mar 19 15:58:05 crc kubenswrapper[4771]: I0319 15:58:05.841583 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565592-vhvjg"]
Mar 19 15:58:06 crc kubenswrapper[4771]: I0319 15:58:06.508611 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:58:06 crc kubenswrapper[4771]: E0319 15:58:06.508833 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:58:07 crc kubenswrapper[4771]: I0319 15:58:07.508950 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:58:07 crc kubenswrapper[4771]: E0319 15:58:07.509602 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:58:07 crc kubenswrapper[4771]: I0319 15:58:07.525254 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa4e8243-5a8d-43e8-816e-d1f92d947d7b" path="/var/lib/kubelet/pods/aa4e8243-5a8d-43e8-816e-d1f92d947d7b/volumes"
Mar 19 15:58:17 crc kubenswrapper[4771]: I0319 15:58:17.509257 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:58:17 crc kubenswrapper[4771]: E0319 15:58:17.510197 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:58:18 crc kubenswrapper[4771]: I0319 15:58:18.509119 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:58:18 crc kubenswrapper[4771]: E0319 15:58:18.509573 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:58:18 crc kubenswrapper[4771]: I0319 15:58:18.509714 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:58:18 crc kubenswrapper[4771]: E0319 15:58:18.510161 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:58:29 crc kubenswrapper[4771]: I0319 15:58:29.509212 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:58:29 crc kubenswrapper[4771]: I0319 15:58:29.509901 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:58:29 crc kubenswrapper[4771]: E0319 15:58:29.510327 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:58:29 crc kubenswrapper[4771]: E0319 15:58:29.510339 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:58:33 crc kubenswrapper[4771]: I0319 15:58:33.509189 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:58:33 crc kubenswrapper[4771]: E0319 15:58:33.510556 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:58:41 crc kubenswrapper[4771]: I0319 15:58:41.519077 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:58:41 crc kubenswrapper[4771]: E0319 15:58:41.519878 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:58:44 crc kubenswrapper[4771]: I0319 15:58:44.509658 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:58:44 crc kubenswrapper[4771]: I0319 15:58:44.510238 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:58:44 crc kubenswrapper[4771]: E0319 15:58:44.510439 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:58:44 crc kubenswrapper[4771]: E0319 15:58:44.510610 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 15:58:53 crc kubenswrapper[4771]: I0319 15:58:53.509449 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:58:53 crc kubenswrapper[4771]: E0319 15:58:53.510544 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:58:55 crc kubenswrapper[4771]: I0319 15:58:55.509686 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100"
Mar 19 15:58:56 crc kubenswrapper[4771]: I0319 15:58:56.297970 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"ac106972bbc38acc7af7f4fb7db71c92708ab1cefb8d1a1910e914ead40dfa5f"}
Mar 19 15:58:58 crc kubenswrapper[4771]: I0319 15:58:58.509683 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:58:58 crc kubenswrapper[4771]: E0319 15:58:58.510888 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:58:58 crc kubenswrapper[4771]: I0319 15:58:58.966483 4771 scope.go:117] "RemoveContainer" containerID="e337875a05a23413148288438d66390227ba7be1e41eb96ec05c723812e9e553"
Mar 19 15:59:08 crc kubenswrapper[4771]: I0319 15:59:08.508855 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:59:08 crc kubenswrapper[4771]: E0319 15:59:08.510041 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:59:13 crc kubenswrapper[4771]: I0319 15:59:13.508434 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:59:13 crc kubenswrapper[4771]: E0319 15:59:13.509267 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:59:22 crc kubenswrapper[4771]: I0319 15:59:22.509312 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:59:22 crc kubenswrapper[4771]: E0319 15:59:22.510267 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:59:24 crc kubenswrapper[4771]: I0319 15:59:24.509304 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:59:24 crc kubenswrapper[4771]: E0319 15:59:24.509561 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:59:34 crc kubenswrapper[4771]: I0319 15:59:34.509544 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:59:34 crc kubenswrapper[4771]: E0319 15:59:34.510665 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 15:59:37 crc kubenswrapper[4771]: I0319 15:59:37.510397 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:59:37 crc kubenswrapper[4771]: E0319 15:59:37.510911 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:59:48 crc kubenswrapper[4771]: I0319 15:59:48.509403 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e"
Mar 19 15:59:48 crc kubenswrapper[4771]: E0319 15:59:48.510179 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 15:59:49 crc kubenswrapper[4771]: I0319 15:59:49.508538 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6"
Mar 19 15:59:49 crc kubenswrapper[4771]: E0319 15:59:49.509180 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.145832 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"]
Mar 19 16:00:00 crc kubenswrapper[4771]: E0319 16:00:00.146687 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303cb7b8-fa9b-4e41-856f-4205f162828e" containerName="oc"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.146699 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="303cb7b8-fa9b-4e41-856f-4205f162828e" containerName="oc"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.146897 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="303cb7b8-fa9b-4e41-856f-4205f162828e" containerName="oc"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.147501 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.149420 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.149810 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.157152 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565600-vtrl5"]
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.158611 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565600-vtrl5"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.164305 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.164570 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.164732 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.204611 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"]
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.214226 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565600-vtrl5"]
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.246513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-secret-volume\") pod \"collect-profiles-29565600-2vlpz\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.246573 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh4gp\" (UniqueName: \"kubernetes.io/projected/3530f179-780f-44a0-81f5-fb76a255fc0d-kube-api-access-qh4gp\") pod \"auto-csr-approver-29565600-vtrl5\" (UID: \"3530f179-780f-44a0-81f5-fb76a255fc0d\") " pod="openshift-infra/auto-csr-approver-29565600-vtrl5"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.246720 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-config-volume\") pod \"collect-profiles-29565600-2vlpz\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.246825 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx2m5\" (UniqueName: \"kubernetes.io/projected/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-kube-api-access-jx2m5\") pod \"collect-profiles-29565600-2vlpz\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.348047 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh4gp\" (UniqueName: \"kubernetes.io/projected/3530f179-780f-44a0-81f5-fb76a255fc0d-kube-api-access-qh4gp\") pod \"auto-csr-approver-29565600-vtrl5\" (UID: \"3530f179-780f-44a0-81f5-fb76a255fc0d\") " pod="openshift-infra/auto-csr-approver-29565600-vtrl5"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.348126 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-config-volume\") pod \"collect-profiles-29565600-2vlpz\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.348212 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx2m5\" (UniqueName: \"kubernetes.io/projected/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-kube-api-access-jx2m5\") pod \"collect-profiles-29565600-2vlpz\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.348355 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-secret-volume\") pod \"collect-profiles-29565600-2vlpz\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.349478 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-config-volume\") pod \"collect-profiles-29565600-2vlpz\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"
Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.356110 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-secret-volume\") pod \"collect-profiles-29565600-2vlpz\" (UID:
\"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz" Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.376659 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh4gp\" (UniqueName: \"kubernetes.io/projected/3530f179-780f-44a0-81f5-fb76a255fc0d-kube-api-access-qh4gp\") pod \"auto-csr-approver-29565600-vtrl5\" (UID: \"3530f179-780f-44a0-81f5-fb76a255fc0d\") " pod="openshift-infra/auto-csr-approver-29565600-vtrl5" Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.377578 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx2m5\" (UniqueName: \"kubernetes.io/projected/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-kube-api-access-jx2m5\") pod \"collect-profiles-29565600-2vlpz\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz" Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.505162 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz" Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.508447 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:00:00 crc kubenswrapper[4771]: E0319 16:00:00.508798 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.509719 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:00:00 crc kubenswrapper[4771]: E0319 16:00:00.510436 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.512750 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565600-vtrl5" Mar 19 16:00:00 crc kubenswrapper[4771]: I0319 16:00:00.977610 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz"] Mar 19 16:00:01 crc kubenswrapper[4771]: I0319 16:00:01.013427 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565600-vtrl5"] Mar 19 16:00:01 crc kubenswrapper[4771]: W0319 16:00:01.017293 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3530f179_780f_44a0_81f5_fb76a255fc0d.slice/crio-e6f250553469f192c8f46a52410884b080d99d7c4caaec4b80e769397f9ec651 WatchSource:0}: Error finding container e6f250553469f192c8f46a52410884b080d99d7c4caaec4b80e769397f9ec651: Status 404 returned error can't find the container with id e6f250553469f192c8f46a52410884b080d99d7c4caaec4b80e769397f9ec651 Mar 19 16:00:01 crc kubenswrapper[4771]: I0319 16:00:01.932301 4771 generic.go:334] "Generic (PLEG): container finished" podID="505381ce-7974-4f3f-8ebb-44a26ff8e7a8" containerID="4e8b7f8a52238e6a4950beeea77680be0e972844b2424f0d7b8540deb84980f1" exitCode=0 Mar 19 16:00:01 crc kubenswrapper[4771]: I0319 16:00:01.932368 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz" event={"ID":"505381ce-7974-4f3f-8ebb-44a26ff8e7a8","Type":"ContainerDied","Data":"4e8b7f8a52238e6a4950beeea77680be0e972844b2424f0d7b8540deb84980f1"} Mar 19 16:00:01 crc kubenswrapper[4771]: I0319 16:00:01.932785 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz" event={"ID":"505381ce-7974-4f3f-8ebb-44a26ff8e7a8","Type":"ContainerStarted","Data":"7c26aaad4a7314d271249065a206ddfc7879ef10a5376df37082efcaffdb323f"} Mar 19 16:00:01 crc kubenswrapper[4771]: I0319 16:00:01.935701 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565600-vtrl5" event={"ID":"3530f179-780f-44a0-81f5-fb76a255fc0d","Type":"ContainerStarted","Data":"e6f250553469f192c8f46a52410884b080d99d7c4caaec4b80e769397f9ec651"} Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.269416 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz" Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.399883 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-config-volume\") pod \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.399922 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-secret-volume\") pod \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.399948 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx2m5\" (UniqueName: \"kubernetes.io/projected/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-kube-api-access-jx2m5\") pod \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\" (UID: \"505381ce-7974-4f3f-8ebb-44a26ff8e7a8\") " Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.401733 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-config-volume" (OuterVolumeSpecName: "config-volume") pod "505381ce-7974-4f3f-8ebb-44a26ff8e7a8" (UID: "505381ce-7974-4f3f-8ebb-44a26ff8e7a8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.406080 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "505381ce-7974-4f3f-8ebb-44a26ff8e7a8" (UID: "505381ce-7974-4f3f-8ebb-44a26ff8e7a8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.407463 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-kube-api-access-jx2m5" (OuterVolumeSpecName: "kube-api-access-jx2m5") pod "505381ce-7974-4f3f-8ebb-44a26ff8e7a8" (UID: "505381ce-7974-4f3f-8ebb-44a26ff8e7a8"). InnerVolumeSpecName "kube-api-access-jx2m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.502088 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.502116 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.502126 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx2m5\" (UniqueName: \"kubernetes.io/projected/505381ce-7974-4f3f-8ebb-44a26ff8e7a8-kube-api-access-jx2m5\") on node \"crc\" DevicePath \"\"" Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.953165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz" 
event={"ID":"505381ce-7974-4f3f-8ebb-44a26ff8e7a8","Type":"ContainerDied","Data":"7c26aaad4a7314d271249065a206ddfc7879ef10a5376df37082efcaffdb323f"} Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.953204 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565600-2vlpz" Mar 19 16:00:03 crc kubenswrapper[4771]: I0319 16:00:03.953211 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c26aaad4a7314d271249065a206ddfc7879ef10a5376df37082efcaffdb323f" Mar 19 16:00:04 crc kubenswrapper[4771]: I0319 16:00:04.370577 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp"] Mar 19 16:00:04 crc kubenswrapper[4771]: I0319 16:00:04.376332 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565555-5tnbp"] Mar 19 16:00:05 crc kubenswrapper[4771]: I0319 16:00:05.518710 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c846868-7bc7-4aca-b36c-b8e85cc31ac2" path="/var/lib/kubelet/pods/7c846868-7bc7-4aca-b36c-b8e85cc31ac2/volumes" Mar 19 16:00:05 crc kubenswrapper[4771]: I0319 16:00:05.978436 4771 generic.go:334] "Generic (PLEG): container finished" podID="3530f179-780f-44a0-81f5-fb76a255fc0d" containerID="b905096a4d7d5ef3e2300501e6f714fad43e606505c9a360c770d5f8b007f72a" exitCode=0 Mar 19 16:00:05 crc kubenswrapper[4771]: I0319 16:00:05.978492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565600-vtrl5" event={"ID":"3530f179-780f-44a0-81f5-fb76a255fc0d","Type":"ContainerDied","Data":"b905096a4d7d5ef3e2300501e6f714fad43e606505c9a360c770d5f8b007f72a"} Mar 19 16:00:07 crc kubenswrapper[4771]: I0319 16:00:07.422833 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565600-vtrl5" Mar 19 16:00:07 crc kubenswrapper[4771]: I0319 16:00:07.481313 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh4gp\" (UniqueName: \"kubernetes.io/projected/3530f179-780f-44a0-81f5-fb76a255fc0d-kube-api-access-qh4gp\") pod \"3530f179-780f-44a0-81f5-fb76a255fc0d\" (UID: \"3530f179-780f-44a0-81f5-fb76a255fc0d\") " Mar 19 16:00:07 crc kubenswrapper[4771]: I0319 16:00:07.487626 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3530f179-780f-44a0-81f5-fb76a255fc0d-kube-api-access-qh4gp" (OuterVolumeSpecName: "kube-api-access-qh4gp") pod "3530f179-780f-44a0-81f5-fb76a255fc0d" (UID: "3530f179-780f-44a0-81f5-fb76a255fc0d"). InnerVolumeSpecName "kube-api-access-qh4gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:00:07 crc kubenswrapper[4771]: I0319 16:00:07.583243 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh4gp\" (UniqueName: \"kubernetes.io/projected/3530f179-780f-44a0-81f5-fb76a255fc0d-kube-api-access-qh4gp\") on node \"crc\" DevicePath \"\"" Mar 19 16:00:08 crc kubenswrapper[4771]: I0319 16:00:08.000673 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565600-vtrl5" event={"ID":"3530f179-780f-44a0-81f5-fb76a255fc0d","Type":"ContainerDied","Data":"e6f250553469f192c8f46a52410884b080d99d7c4caaec4b80e769397f9ec651"} Mar 19 16:00:08 crc kubenswrapper[4771]: I0319 16:00:08.000741 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6f250553469f192c8f46a52410884b080d99d7c4caaec4b80e769397f9ec651" Mar 19 16:00:08 crc kubenswrapper[4771]: I0319 16:00:08.001433 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565600-vtrl5" Mar 19 16:00:08 crc kubenswrapper[4771]: I0319 16:00:08.497831 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565594-f7lhs"] Mar 19 16:00:08 crc kubenswrapper[4771]: I0319 16:00:08.509653 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565594-f7lhs"] Mar 19 16:00:09 crc kubenswrapper[4771]: I0319 16:00:09.519753 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acbc2604-3df1-40dc-a50b-7c516c8ee3c6" path="/var/lib/kubelet/pods/acbc2604-3df1-40dc-a50b-7c516c8ee3c6/volumes" Mar 19 16:00:13 crc kubenswrapper[4771]: I0319 16:00:13.508818 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:00:13 crc kubenswrapper[4771]: E0319 16:00:13.509258 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:00:14 crc kubenswrapper[4771]: I0319 16:00:14.509886 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:00:14 crc kubenswrapper[4771]: E0319 16:00:14.510541 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:00:25 crc kubenswrapper[4771]: I0319 16:00:25.508821 4771 scope.go:117] "RemoveContainer" 
containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:00:25 crc kubenswrapper[4771]: E0319 16:00:25.509593 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:00:26 crc kubenswrapper[4771]: I0319 16:00:26.508326 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:00:26 crc kubenswrapper[4771]: E0319 16:00:26.508560 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:00:37 crc kubenswrapper[4771]: I0319 16:00:37.509839 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:00:37 crc kubenswrapper[4771]: E0319 16:00:37.511204 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:00:39 crc kubenswrapper[4771]: I0319 16:00:39.509528 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:00:39 crc kubenswrapper[4771]: E0319 16:00:39.510371 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:00:48 crc kubenswrapper[4771]: I0319 16:00:48.509349 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:00:48 crc kubenswrapper[4771]: E0319 16:00:48.510707 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:00:52 crc kubenswrapper[4771]: I0319 16:00:52.508137 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:00:52 crc kubenswrapper[4771]: E0319 16:00:52.508916 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:00:59 crc kubenswrapper[4771]: I0319 16:00:59.073927 4771 scope.go:117] "RemoveContainer" containerID="6d27cff11ca6aa5cb01859ebf55278cc2a592117f476c9260f648dd1cda3c0ef" Mar 19 16:00:59 crc kubenswrapper[4771]: I0319 16:00:59.113445 4771 scope.go:117] "RemoveContainer" containerID="ba30c999595dde6311398027fb6f1ad65a93d653d1ed44df2b81a82c88b8cc19" Mar 19 16:01:01 crc kubenswrapper[4771]: I0319 16:01:01.514604 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:01:01 crc kubenswrapper[4771]: E0319 
16:01:01.515700 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:01:03 crc kubenswrapper[4771]: I0319 16:01:03.508685 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:01:03 crc kubenswrapper[4771]: E0319 16:01:03.509201 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:01:12 crc kubenswrapper[4771]: I0319 16:01:12.508772 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:01:12 crc kubenswrapper[4771]: E0319 16:01:12.509785 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:01:14 crc kubenswrapper[4771]: I0319 16:01:14.509188 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:01:14 crc kubenswrapper[4771]: E0319 16:01:14.510153 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:01:23 crc kubenswrapper[4771]: I0319 16:01:23.027791 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:01:23 crc kubenswrapper[4771]: I0319 16:01:23.028371 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:01:25 crc kubenswrapper[4771]: I0319 16:01:25.510643 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:01:25 crc kubenswrapper[4771]: E0319 16:01:25.528337 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:01:28 crc kubenswrapper[4771]: I0319 16:01:28.508500 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:01:28 crc kubenswrapper[4771]: E0319 16:01:28.509066 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" 
pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:01:40 crc kubenswrapper[4771]: I0319 16:01:40.509218 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:01:40 crc kubenswrapper[4771]: E0319 16:01:40.510134 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:01:43 crc kubenswrapper[4771]: I0319 16:01:43.509259 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:01:43 crc kubenswrapper[4771]: E0319 16:01:43.509833 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.027213 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.027848 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 
19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.591061 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cnzdp"] Mar 19 16:01:53 crc kubenswrapper[4771]: E0319 16:01:53.591669 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505381ce-7974-4f3f-8ebb-44a26ff8e7a8" containerName="collect-profiles" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.591786 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="505381ce-7974-4f3f-8ebb-44a26ff8e7a8" containerName="collect-profiles" Mar 19 16:01:53 crc kubenswrapper[4771]: E0319 16:01:53.591909 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3530f179-780f-44a0-81f5-fb76a255fc0d" containerName="oc" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.592018 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3530f179-780f-44a0-81f5-fb76a255fc0d" containerName="oc" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.592275 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="505381ce-7974-4f3f-8ebb-44a26ff8e7a8" containerName="collect-profiles" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.592385 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3530f179-780f-44a0-81f5-fb76a255fc0d" containerName="oc" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.593971 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.598820 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhl4r\" (UniqueName: \"kubernetes.io/projected/cb818a19-8e2e-429c-84ec-bbb304a4e219-kube-api-access-rhl4r\") pod \"community-operators-cnzdp\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.599083 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-catalog-content\") pod \"community-operators-cnzdp\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.599211 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-utilities\") pod \"community-operators-cnzdp\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.615031 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnzdp"] Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.700362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhl4r\" (UniqueName: \"kubernetes.io/projected/cb818a19-8e2e-429c-84ec-bbb304a4e219-kube-api-access-rhl4r\") pod \"community-operators-cnzdp\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.700655 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-catalog-content\") pod \"community-operators-cnzdp\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.700675 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-utilities\") pod \"community-operators-cnzdp\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.701384 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-utilities\") pod \"community-operators-cnzdp\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.701432 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-catalog-content\") pod \"community-operators-cnzdp\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.722120 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhl4r\" (UniqueName: \"kubernetes.io/projected/cb818a19-8e2e-429c-84ec-bbb304a4e219-kube-api-access-rhl4r\") pod \"community-operators-cnzdp\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:53 crc kubenswrapper[4771]: I0319 16:01:53.913886 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:01:54 crc kubenswrapper[4771]: I0319 16:01:54.438342 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnzdp"] Mar 19 16:01:54 crc kubenswrapper[4771]: I0319 16:01:54.994159 4771 generic.go:334] "Generic (PLEG): container finished" podID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerID="eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f" exitCode=0 Mar 19 16:01:54 crc kubenswrapper[4771]: I0319 16:01:54.994268 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnzdp" event={"ID":"cb818a19-8e2e-429c-84ec-bbb304a4e219","Type":"ContainerDied","Data":"eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f"} Mar 19 16:01:54 crc kubenswrapper[4771]: I0319 16:01:54.994491 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnzdp" event={"ID":"cb818a19-8e2e-429c-84ec-bbb304a4e219","Type":"ContainerStarted","Data":"4552a99bd3aed3c647c8d3a555b1353861efc961accc5444eca1d63f78e90638"} Mar 19 16:01:54 crc kubenswrapper[4771]: I0319 16:01:54.996246 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 16:01:55 crc kubenswrapper[4771]: I0319 16:01:55.509578 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:01:55 crc kubenswrapper[4771]: E0319 16:01:55.509800 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.374204 4771 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hdjw9"] Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.376593 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.386320 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdjw9"] Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.508738 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:01:56 crc kubenswrapper[4771]: E0319 16:01:56.509064 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.550131 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjfj\" (UniqueName: \"kubernetes.io/projected/a06d1591-6aaf-4832-a9d7-705e23eb6614-kube-api-access-9jjfj\") pod \"redhat-marketplace-hdjw9\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.550192 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-utilities\") pod \"redhat-marketplace-hdjw9\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.550285 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-catalog-content\") pod \"redhat-marketplace-hdjw9\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.651755 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-catalog-content\") pod \"redhat-marketplace-hdjw9\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.652708 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-catalog-content\") pod \"redhat-marketplace-hdjw9\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.653023 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjfj\" (UniqueName: \"kubernetes.io/projected/a06d1591-6aaf-4832-a9d7-705e23eb6614-kube-api-access-9jjfj\") pod \"redhat-marketplace-hdjw9\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.653711 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-utilities\") pod \"redhat-marketplace-hdjw9\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.654100 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-utilities\") pod \"redhat-marketplace-hdjw9\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.674025 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjfj\" (UniqueName: \"kubernetes.io/projected/a06d1591-6aaf-4832-a9d7-705e23eb6614-kube-api-access-9jjfj\") pod \"redhat-marketplace-hdjw9\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:56 crc kubenswrapper[4771]: I0319 16:01:56.696290 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:01:57 crc kubenswrapper[4771]: I0319 16:01:57.010220 4771 generic.go:334] "Generic (PLEG): container finished" podID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerID="ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358" exitCode=0 Mar 19 16:01:57 crc kubenswrapper[4771]: I0319 16:01:57.010268 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnzdp" event={"ID":"cb818a19-8e2e-429c-84ec-bbb304a4e219","Type":"ContainerDied","Data":"ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358"} Mar 19 16:01:57 crc kubenswrapper[4771]: I0319 16:01:57.151302 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdjw9"] Mar 19 16:01:57 crc kubenswrapper[4771]: W0319 16:01:57.155886 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda06d1591_6aaf_4832_a9d7_705e23eb6614.slice/crio-57456c2216687b9aa31b49e93aabe67e6710293cf0ec3d78a2d3d398d4b4d4ea WatchSource:0}: Error finding container 57456c2216687b9aa31b49e93aabe67e6710293cf0ec3d78a2d3d398d4b4d4ea: Status 404 returned 
error can't find the container with id 57456c2216687b9aa31b49e93aabe67e6710293cf0ec3d78a2d3d398d4b4d4ea Mar 19 16:01:58 crc kubenswrapper[4771]: I0319 16:01:58.020371 4771 generic.go:334] "Generic (PLEG): container finished" podID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerID="6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81" exitCode=0 Mar 19 16:01:58 crc kubenswrapper[4771]: I0319 16:01:58.020474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdjw9" event={"ID":"a06d1591-6aaf-4832-a9d7-705e23eb6614","Type":"ContainerDied","Data":"6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81"} Mar 19 16:01:58 crc kubenswrapper[4771]: I0319 16:01:58.020839 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdjw9" event={"ID":"a06d1591-6aaf-4832-a9d7-705e23eb6614","Type":"ContainerStarted","Data":"57456c2216687b9aa31b49e93aabe67e6710293cf0ec3d78a2d3d398d4b4d4ea"} Mar 19 16:01:58 crc kubenswrapper[4771]: I0319 16:01:58.026276 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnzdp" event={"ID":"cb818a19-8e2e-429c-84ec-bbb304a4e219","Type":"ContainerStarted","Data":"2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df"} Mar 19 16:01:58 crc kubenswrapper[4771]: I0319 16:01:58.057177 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cnzdp" podStartSLOduration=2.594617265 podStartE2EDuration="5.057157626s" podCreationTimestamp="2026-03-19 16:01:53 +0000 UTC" firstStartedPulling="2026-03-19 16:01:54.995963836 +0000 UTC m=+2774.224585048" lastFinishedPulling="2026-03-19 16:01:57.458504207 +0000 UTC m=+2776.687125409" observedRunningTime="2026-03-19 16:01:58.053855996 +0000 UTC m=+2777.282477198" watchObservedRunningTime="2026-03-19 16:01:58.057157626 +0000 UTC m=+2777.285778828" Mar 19 16:01:59 crc 
kubenswrapper[4771]: I0319 16:01:59.034710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdjw9" event={"ID":"a06d1591-6aaf-4832-a9d7-705e23eb6614","Type":"ContainerStarted","Data":"633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70"} Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.044098 4771 generic.go:334] "Generic (PLEG): container finished" podID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerID="633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70" exitCode=0 Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.044166 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdjw9" event={"ID":"a06d1591-6aaf-4832-a9d7-705e23eb6614","Type":"ContainerDied","Data":"633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70"} Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.159169 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565602-9qsx7"] Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.160117 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565602-9qsx7" Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.162255 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.162318 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.165445 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.169430 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565602-9qsx7"] Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.321566 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpcwn\" (UniqueName: \"kubernetes.io/projected/e824d290-1b4d-410d-8380-acaa5374e4d2-kube-api-access-wpcwn\") pod \"auto-csr-approver-29565602-9qsx7\" (UID: \"e824d290-1b4d-410d-8380-acaa5374e4d2\") " pod="openshift-infra/auto-csr-approver-29565602-9qsx7" Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.423289 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpcwn\" (UniqueName: \"kubernetes.io/projected/e824d290-1b4d-410d-8380-acaa5374e4d2-kube-api-access-wpcwn\") pod \"auto-csr-approver-29565602-9qsx7\" (UID: \"e824d290-1b4d-410d-8380-acaa5374e4d2\") " pod="openshift-infra/auto-csr-approver-29565602-9qsx7" Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.442581 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpcwn\" (UniqueName: \"kubernetes.io/projected/e824d290-1b4d-410d-8380-acaa5374e4d2-kube-api-access-wpcwn\") pod \"auto-csr-approver-29565602-9qsx7\" (UID: \"e824d290-1b4d-410d-8380-acaa5374e4d2\") " 
pod="openshift-infra/auto-csr-approver-29565602-9qsx7" Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.479434 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565602-9qsx7" Mar 19 16:02:00 crc kubenswrapper[4771]: I0319 16:02:00.974418 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565602-9qsx7"] Mar 19 16:02:00 crc kubenswrapper[4771]: W0319 16:02:00.979524 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode824d290_1b4d_410d_8380_acaa5374e4d2.slice/crio-44d6659514a00158232df6a731fc6ed158f7d2418979e55e913732f2601a636c WatchSource:0}: Error finding container 44d6659514a00158232df6a731fc6ed158f7d2418979e55e913732f2601a636c: Status 404 returned error can't find the container with id 44d6659514a00158232df6a731fc6ed158f7d2418979e55e913732f2601a636c Mar 19 16:02:01 crc kubenswrapper[4771]: I0319 16:02:01.055063 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565602-9qsx7" event={"ID":"e824d290-1b4d-410d-8380-acaa5374e4d2","Type":"ContainerStarted","Data":"44d6659514a00158232df6a731fc6ed158f7d2418979e55e913732f2601a636c"} Mar 19 16:02:03 crc kubenswrapper[4771]: I0319 16:02:03.071490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565602-9qsx7" event={"ID":"e824d290-1b4d-410d-8380-acaa5374e4d2","Type":"ContainerStarted","Data":"8f403cb6431ee82ecf8ce8aeaf223d32343e3c7135ce10e25b02efc791ccf1c9"} Mar 19 16:02:03 crc kubenswrapper[4771]: I0319 16:02:03.075774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdjw9" event={"ID":"a06d1591-6aaf-4832-a9d7-705e23eb6614","Type":"ContainerStarted","Data":"cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b"} Mar 19 16:02:03 crc kubenswrapper[4771]: I0319 16:02:03.093219 
4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565602-9qsx7" podStartSLOduration=1.517765011 podStartE2EDuration="3.093200089s" podCreationTimestamp="2026-03-19 16:02:00 +0000 UTC" firstStartedPulling="2026-03-19 16:02:00.982357956 +0000 UTC m=+2780.210979188" lastFinishedPulling="2026-03-19 16:02:02.557793064 +0000 UTC m=+2781.786414266" observedRunningTime="2026-03-19 16:02:03.086536237 +0000 UTC m=+2782.315157439" watchObservedRunningTime="2026-03-19 16:02:03.093200089 +0000 UTC m=+2782.321821281" Mar 19 16:02:03 crc kubenswrapper[4771]: I0319 16:02:03.110293 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hdjw9" podStartSLOduration=2.839737081 podStartE2EDuration="7.110274683s" podCreationTimestamp="2026-03-19 16:01:56 +0000 UTC" firstStartedPulling="2026-03-19 16:01:58.025764395 +0000 UTC m=+2777.254385637" lastFinishedPulling="2026-03-19 16:02:02.296302037 +0000 UTC m=+2781.524923239" observedRunningTime="2026-03-19 16:02:03.103777086 +0000 UTC m=+2782.332398308" watchObservedRunningTime="2026-03-19 16:02:03.110274683 +0000 UTC m=+2782.338895885" Mar 19 16:02:03 crc kubenswrapper[4771]: I0319 16:02:03.914210 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:02:03 crc kubenswrapper[4771]: I0319 16:02:03.914547 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:02:03 crc kubenswrapper[4771]: I0319 16:02:03.972911 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:02:04 crc kubenswrapper[4771]: I0319 16:02:04.084600 4771 generic.go:334] "Generic (PLEG): container finished" podID="e824d290-1b4d-410d-8380-acaa5374e4d2" 
containerID="8f403cb6431ee82ecf8ce8aeaf223d32343e3c7135ce10e25b02efc791ccf1c9" exitCode=0 Mar 19 16:02:04 crc kubenswrapper[4771]: I0319 16:02:04.084682 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565602-9qsx7" event={"ID":"e824d290-1b4d-410d-8380-acaa5374e4d2","Type":"ContainerDied","Data":"8f403cb6431ee82ecf8ce8aeaf223d32343e3c7135ce10e25b02efc791ccf1c9"} Mar 19 16:02:04 crc kubenswrapper[4771]: I0319 16:02:04.131132 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:02:05 crc kubenswrapper[4771]: I0319 16:02:05.165642 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cnzdp"] Mar 19 16:02:05 crc kubenswrapper[4771]: I0319 16:02:05.400382 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565602-9qsx7" Mar 19 16:02:05 crc kubenswrapper[4771]: I0319 16:02:05.522640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpcwn\" (UniqueName: \"kubernetes.io/projected/e824d290-1b4d-410d-8380-acaa5374e4d2-kube-api-access-wpcwn\") pod \"e824d290-1b4d-410d-8380-acaa5374e4d2\" (UID: \"e824d290-1b4d-410d-8380-acaa5374e4d2\") " Mar 19 16:02:05 crc kubenswrapper[4771]: I0319 16:02:05.536393 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e824d290-1b4d-410d-8380-acaa5374e4d2-kube-api-access-wpcwn" (OuterVolumeSpecName: "kube-api-access-wpcwn") pod "e824d290-1b4d-410d-8380-acaa5374e4d2" (UID: "e824d290-1b4d-410d-8380-acaa5374e4d2"). InnerVolumeSpecName "kube-api-access-wpcwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:02:05 crc kubenswrapper[4771]: I0319 16:02:05.625171 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpcwn\" (UniqueName: \"kubernetes.io/projected/e824d290-1b4d-410d-8380-acaa5374e4d2-kube-api-access-wpcwn\") on node \"crc\" DevicePath \"\"" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.108291 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565602-9qsx7" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.108637 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565602-9qsx7" event={"ID":"e824d290-1b4d-410d-8380-acaa5374e4d2","Type":"ContainerDied","Data":"44d6659514a00158232df6a731fc6ed158f7d2418979e55e913732f2601a636c"} Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.108666 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44d6659514a00158232df6a731fc6ed158f7d2418979e55e913732f2601a636c" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.108401 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cnzdp" podUID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerName="registry-server" containerID="cri-o://2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df" gracePeriod=2 Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.187530 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565596-gknh6"] Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.199287 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565596-gknh6"] Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.574883 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.696693 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.696821 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.738664 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.751732 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-catalog-content\") pod \"cb818a19-8e2e-429c-84ec-bbb304a4e219\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.751925 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhl4r\" (UniqueName: \"kubernetes.io/projected/cb818a19-8e2e-429c-84ec-bbb304a4e219-kube-api-access-rhl4r\") pod \"cb818a19-8e2e-429c-84ec-bbb304a4e219\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.751956 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-utilities\") pod \"cb818a19-8e2e-429c-84ec-bbb304a4e219\" (UID: \"cb818a19-8e2e-429c-84ec-bbb304a4e219\") " Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.752946 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-utilities" (OuterVolumeSpecName: "utilities") pod 
"cb818a19-8e2e-429c-84ec-bbb304a4e219" (UID: "cb818a19-8e2e-429c-84ec-bbb304a4e219"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.765251 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb818a19-8e2e-429c-84ec-bbb304a4e219-kube-api-access-rhl4r" (OuterVolumeSpecName: "kube-api-access-rhl4r") pod "cb818a19-8e2e-429c-84ec-bbb304a4e219" (UID: "cb818a19-8e2e-429c-84ec-bbb304a4e219"). InnerVolumeSpecName "kube-api-access-rhl4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.814402 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb818a19-8e2e-429c-84ec-bbb304a4e219" (UID: "cb818a19-8e2e-429c-84ec-bbb304a4e219"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.854145 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.854186 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhl4r\" (UniqueName: \"kubernetes.io/projected/cb818a19-8e2e-429c-84ec-bbb304a4e219-kube-api-access-rhl4r\") on node \"crc\" DevicePath \"\"" Mar 19 16:02:06 crc kubenswrapper[4771]: I0319 16:02:06.854253 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb818a19-8e2e-429c-84ec-bbb304a4e219-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.118658 4771 generic.go:334] "Generic (PLEG): container finished" podID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerID="2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df" exitCode=0 Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.118734 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnzdp" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.118741 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnzdp" event={"ID":"cb818a19-8e2e-429c-84ec-bbb304a4e219","Type":"ContainerDied","Data":"2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df"} Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.119213 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnzdp" event={"ID":"cb818a19-8e2e-429c-84ec-bbb304a4e219","Type":"ContainerDied","Data":"4552a99bd3aed3c647c8d3a555b1353861efc961accc5444eca1d63f78e90638"} Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.119266 4771 scope.go:117] "RemoveContainer" containerID="2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.155166 4771 scope.go:117] "RemoveContainer" containerID="ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.157254 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cnzdp"] Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.167325 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cnzdp"] Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.175860 4771 scope.go:117] "RemoveContainer" containerID="eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.180811 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.219475 4771 scope.go:117] "RemoveContainer" containerID="2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df" Mar 19 16:02:07 crc 
kubenswrapper[4771]: E0319 16:02:07.220011 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df\": container with ID starting with 2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df not found: ID does not exist" containerID="2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.220066 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df"} err="failed to get container status \"2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df\": rpc error: code = NotFound desc = could not find container \"2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df\": container with ID starting with 2987c5e81e9103226f73121bef85fbefd1b5cb95a93117fec99794754498f2df not found: ID does not exist" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.220094 4771 scope.go:117] "RemoveContainer" containerID="ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358" Mar 19 16:02:07 crc kubenswrapper[4771]: E0319 16:02:07.221096 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358\": container with ID starting with ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358 not found: ID does not exist" containerID="ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.221130 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358"} err="failed to get container status 
\"ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358\": rpc error: code = NotFound desc = could not find container \"ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358\": container with ID starting with ed97233deaf6e36c7aa538381037005df02f01cc502cef81160369c8914b3358 not found: ID does not exist" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.221150 4771 scope.go:117] "RemoveContainer" containerID="eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f" Mar 19 16:02:07 crc kubenswrapper[4771]: E0319 16:02:07.221554 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f\": container with ID starting with eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f not found: ID does not exist" containerID="eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.221578 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f"} err="failed to get container status \"eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f\": rpc error: code = NotFound desc = could not find container \"eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f\": container with ID starting with eb02d687d484ca46640e862c279bd70d09f3f1c837d74cb3244ef742622d336f not found: ID does not exist" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.509145 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.509263 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:02:07 crc kubenswrapper[4771]: E0319 16:02:07.509354 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:02:07 crc kubenswrapper[4771]: E0319 16:02:07.509562 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.517490 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb818a19-8e2e-429c-84ec-bbb304a4e219" path="/var/lib/kubelet/pods/cb818a19-8e2e-429c-84ec-bbb304a4e219/volumes" Mar 19 16:02:07 crc kubenswrapper[4771]: I0319 16:02:07.518114 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7206757-9d91-42ee-b7f6-6470ab1ccab7" path="/var/lib/kubelet/pods/d7206757-9d91-42ee-b7f6-6470ab1ccab7/volumes" Mar 19 16:02:09 crc kubenswrapper[4771]: I0319 16:02:09.166683 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdjw9"] Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.140740 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hdjw9" podUID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerName="registry-server" containerID="cri-o://cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b" gracePeriod=2 Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.634572 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.716790 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-utilities\") pod \"a06d1591-6aaf-4832-a9d7-705e23eb6614\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.716845 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-catalog-content\") pod \"a06d1591-6aaf-4832-a9d7-705e23eb6614\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.717117 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjfj\" (UniqueName: \"kubernetes.io/projected/a06d1591-6aaf-4832-a9d7-705e23eb6614-kube-api-access-9jjfj\") pod \"a06d1591-6aaf-4832-a9d7-705e23eb6614\" (UID: \"a06d1591-6aaf-4832-a9d7-705e23eb6614\") " Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.718057 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-utilities" (OuterVolumeSpecName: "utilities") pod "a06d1591-6aaf-4832-a9d7-705e23eb6614" (UID: "a06d1591-6aaf-4832-a9d7-705e23eb6614"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.723117 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06d1591-6aaf-4832-a9d7-705e23eb6614-kube-api-access-9jjfj" (OuterVolumeSpecName: "kube-api-access-9jjfj") pod "a06d1591-6aaf-4832-a9d7-705e23eb6614" (UID: "a06d1591-6aaf-4832-a9d7-705e23eb6614"). InnerVolumeSpecName "kube-api-access-9jjfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.741553 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a06d1591-6aaf-4832-a9d7-705e23eb6614" (UID: "a06d1591-6aaf-4832-a9d7-705e23eb6614"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.818944 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjfj\" (UniqueName: \"kubernetes.io/projected/a06d1591-6aaf-4832-a9d7-705e23eb6614-kube-api-access-9jjfj\") on node \"crc\" DevicePath \"\"" Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.818980 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:02:10 crc kubenswrapper[4771]: I0319 16:02:10.819017 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a06d1591-6aaf-4832-a9d7-705e23eb6614-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.151290 4771 generic.go:334] "Generic (PLEG): container finished" podID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerID="cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b" exitCode=0 Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.151332 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdjw9" event={"ID":"a06d1591-6aaf-4832-a9d7-705e23eb6614","Type":"ContainerDied","Data":"cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b"} Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.151357 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-hdjw9" event={"ID":"a06d1591-6aaf-4832-a9d7-705e23eb6614","Type":"ContainerDied","Data":"57456c2216687b9aa31b49e93aabe67e6710293cf0ec3d78a2d3d398d4b4d4ea"} Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.151372 4771 scope.go:117] "RemoveContainer" containerID="cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.151498 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdjw9" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.194477 4771 scope.go:117] "RemoveContainer" containerID="633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.199300 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdjw9"] Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.209607 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdjw9"] Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.219197 4771 scope.go:117] "RemoveContainer" containerID="6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.237469 4771 scope.go:117] "RemoveContainer" containerID="cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b" Mar 19 16:02:11 crc kubenswrapper[4771]: E0319 16:02:11.238207 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b\": container with ID starting with cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b not found: ID does not exist" containerID="cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.238249 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b"} err="failed to get container status \"cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b\": rpc error: code = NotFound desc = could not find container \"cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b\": container with ID starting with cf0c0ed93ecabfadfdbcc13422f400473c2738cfd09fd8b53c28a82aefec973b not found: ID does not exist" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.238274 4771 scope.go:117] "RemoveContainer" containerID="633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70" Mar 19 16:02:11 crc kubenswrapper[4771]: E0319 16:02:11.238665 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70\": container with ID starting with 633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70 not found: ID does not exist" containerID="633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.238695 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70"} err="failed to get container status \"633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70\": rpc error: code = NotFound desc = could not find container \"633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70\": container with ID starting with 633980bf3591d5b87aa58280406a13bdf77117890aa898d4dbe10477711bee70 not found: ID does not exist" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.238717 4771 scope.go:117] "RemoveContainer" containerID="6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81" Mar 19 16:02:11 crc kubenswrapper[4771]: E0319 
16:02:11.239294 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81\": container with ID starting with 6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81 not found: ID does not exist" containerID="6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.239314 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81"} err="failed to get container status \"6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81\": rpc error: code = NotFound desc = could not find container \"6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81\": container with ID starting with 6e5d8e8cd6bf14382fcbe9ec78383b0651d9ab64b108b2c8b12339e9a92dbd81 not found: ID does not exist" Mar 19 16:02:11 crc kubenswrapper[4771]: I0319 16:02:11.519838 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a06d1591-6aaf-4832-a9d7-705e23eb6614" path="/var/lib/kubelet/pods/a06d1591-6aaf-4832-a9d7-705e23eb6614/volumes" Mar 19 16:02:21 crc kubenswrapper[4771]: I0319 16:02:21.518208 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:02:21 crc kubenswrapper[4771]: E0319 16:02:21.518947 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:02:22 crc kubenswrapper[4771]: I0319 16:02:22.509021 4771 scope.go:117] "RemoveContainer" 
containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:02:22 crc kubenswrapper[4771]: E0319 16:02:22.509245 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:02:23 crc kubenswrapper[4771]: I0319 16:02:23.028262 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:02:23 crc kubenswrapper[4771]: I0319 16:02:23.029721 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:02:23 crc kubenswrapper[4771]: I0319 16:02:23.029924 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" Mar 19 16:02:23 crc kubenswrapper[4771]: I0319 16:02:23.031160 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac106972bbc38acc7af7f4fb7db71c92708ab1cefb8d1a1910e914ead40dfa5f"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 16:02:23 crc kubenswrapper[4771]: I0319 16:02:23.031427 4771 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://ac106972bbc38acc7af7f4fb7db71c92708ab1cefb8d1a1910e914ead40dfa5f" gracePeriod=600 Mar 19 16:02:23 crc kubenswrapper[4771]: I0319 16:02:23.257481 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="ac106972bbc38acc7af7f4fb7db71c92708ab1cefb8d1a1910e914ead40dfa5f" exitCode=0 Mar 19 16:02:23 crc kubenswrapper[4771]: I0319 16:02:23.257535 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"ac106972bbc38acc7af7f4fb7db71c92708ab1cefb8d1a1910e914ead40dfa5f"} Mar 19 16:02:23 crc kubenswrapper[4771]: I0319 16:02:23.257581 4771 scope.go:117] "RemoveContainer" containerID="308e698226a1bc2162dde709ff46f22ee4a8cb33c09cb8a9a23bee674e006100" Mar 19 16:02:24 crc kubenswrapper[4771]: I0319 16:02:24.268899 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"} Mar 19 16:02:33 crc kubenswrapper[4771]: I0319 16:02:33.508697 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:02:34 crc kubenswrapper[4771]: I0319 16:02:34.363308 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"} Mar 19 16:02:34 crc kubenswrapper[4771]: I0319 16:02:34.363977 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Mar 19 16:02:34 crc kubenswrapper[4771]: I0319 16:02:34.509262 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:02:35 crc kubenswrapper[4771]: I0319 16:02:35.383152 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"} Mar 19 16:02:35 crc kubenswrapper[4771]: I0319 16:02:35.384581 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:02:38 crc kubenswrapper[4771]: I0319 16:02:38.428363 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" exitCode=0 Mar 19 16:02:38 crc kubenswrapper[4771]: I0319 16:02:38.428464 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"} Mar 19 16:02:38 crc kubenswrapper[4771]: I0319 16:02:38.428643 4771 scope.go:117] "RemoveContainer" containerID="213ef379e4f1221ae4bc0a8ac99389e1ee5177d9a2e1193f2f927ed71da2d7d6" Mar 19 16:02:38 crc kubenswrapper[4771]: I0319 16:02:38.430182 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:02:38 crc kubenswrapper[4771]: E0319 16:02:38.430692 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" 
podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:02:39 crc kubenswrapper[4771]: I0319 16:02:39.442828 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" exitCode=0 Mar 19 16:02:39 crc kubenswrapper[4771]: I0319 16:02:39.442930 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"} Mar 19 16:02:39 crc kubenswrapper[4771]: I0319 16:02:39.443112 4771 scope.go:117] "RemoveContainer" containerID="7ec8ea23d290438d416d09480f0a5d4e2c61db306849abba08e41bd6c1bb255e" Mar 19 16:02:39 crc kubenswrapper[4771]: I0319 16:02:39.443644 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:02:39 crc kubenswrapper[4771]: E0319 16:02:39.443836 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:02:50 crc kubenswrapper[4771]: I0319 16:02:50.508654 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:02:50 crc kubenswrapper[4771]: E0319 16:02:50.509378 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:02:54 crc kubenswrapper[4771]: 
I0319 16:02:54.509427 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:02:54 crc kubenswrapper[4771]: E0319 16:02:54.510297 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:02:59 crc kubenswrapper[4771]: I0319 16:02:59.238025 4771 scope.go:117] "RemoveContainer" containerID="09e82ae5589b0dce46f27363b47c5a6b9cfa6b71f3bdcd3a9c01a3c53849337a" Mar 19 16:03:05 crc kubenswrapper[4771]: I0319 16:03:05.509364 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:03:05 crc kubenswrapper[4771]: E0319 16:03:05.510605 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:03:07 crc kubenswrapper[4771]: I0319 16:03:07.508913 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:03:07 crc kubenswrapper[4771]: E0319 16:03:07.509432 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:03:16 crc kubenswrapper[4771]: I0319 16:03:16.509559 4771 scope.go:117] "RemoveContainer" 
containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:03:16 crc kubenswrapper[4771]: E0319 16:03:16.510700 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:03:20 crc kubenswrapper[4771]: I0319 16:03:20.509446 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:03:20 crc kubenswrapper[4771]: E0319 16:03:20.510371 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:03:28 crc kubenswrapper[4771]: I0319 16:03:28.509330 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:03:28 crc kubenswrapper[4771]: E0319 16:03:28.510380 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:03:33 crc kubenswrapper[4771]: I0319 16:03:33.509295 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:03:33 crc kubenswrapper[4771]: E0319 16:03:33.509977 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:03:39 crc kubenswrapper[4771]: I0319 16:03:39.509504 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:03:39 crc kubenswrapper[4771]: E0319 16:03:39.510483 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:03:47 crc kubenswrapper[4771]: I0319 16:03:47.508792 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:03:47 crc kubenswrapper[4771]: E0319 16:03:47.509517 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:03:53 crc kubenswrapper[4771]: I0319 16:03:53.508617 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:03:53 crc kubenswrapper[4771]: E0319 16:03:53.509682 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:04:00 
crc kubenswrapper[4771]: I0319 16:04:00.160788 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565604-9jxxb"] Mar 19 16:04:00 crc kubenswrapper[4771]: E0319 16:04:00.163464 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerName="extract-utilities" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.163712 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerName="extract-utilities" Mar 19 16:04:00 crc kubenswrapper[4771]: E0319 16:04:00.163931 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerName="extract-utilities" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.164193 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerName="extract-utilities" Mar 19 16:04:00 crc kubenswrapper[4771]: E0319 16:04:00.164422 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerName="extract-content" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.164618 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerName="extract-content" Mar 19 16:04:00 crc kubenswrapper[4771]: E0319 16:04:00.164809 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerName="extract-content" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.164950 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerName="extract-content" Mar 19 16:04:00 crc kubenswrapper[4771]: E0319 16:04:00.165187 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerName="registry-server" Mar 19 16:04:00 crc kubenswrapper[4771]: 
I0319 16:04:00.165386 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerName="registry-server" Mar 19 16:04:00 crc kubenswrapper[4771]: E0319 16:04:00.165593 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e824d290-1b4d-410d-8380-acaa5374e4d2" containerName="oc" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.165786 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e824d290-1b4d-410d-8380-acaa5374e4d2" containerName="oc" Mar 19 16:04:00 crc kubenswrapper[4771]: E0319 16:04:00.166093 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerName="registry-server" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.166319 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerName="registry-server" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.166923 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb818a19-8e2e-429c-84ec-bbb304a4e219" containerName="registry-server" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.167248 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06d1591-6aaf-4832-a9d7-705e23eb6614" containerName="registry-server" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.167470 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e824d290-1b4d-410d-8380-acaa5374e4d2" containerName="oc" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.168798 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565604-9jxxb" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.171086 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565604-9jxxb"] Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.172494 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.173598 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.174671 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.253616 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6hz\" (UniqueName: \"kubernetes.io/projected/e6e2569e-154b-490c-8ed8-d0143bc882b8-kube-api-access-rr6hz\") pod \"auto-csr-approver-29565604-9jxxb\" (UID: \"e6e2569e-154b-490c-8ed8-d0143bc882b8\") " pod="openshift-infra/auto-csr-approver-29565604-9jxxb" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.355616 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6hz\" (UniqueName: \"kubernetes.io/projected/e6e2569e-154b-490c-8ed8-d0143bc882b8-kube-api-access-rr6hz\") pod \"auto-csr-approver-29565604-9jxxb\" (UID: \"e6e2569e-154b-490c-8ed8-d0143bc882b8\") " pod="openshift-infra/auto-csr-approver-29565604-9jxxb" Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.384356 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6hz\" (UniqueName: \"kubernetes.io/projected/e6e2569e-154b-490c-8ed8-d0143bc882b8-kube-api-access-rr6hz\") pod \"auto-csr-approver-29565604-9jxxb\" (UID: \"e6e2569e-154b-490c-8ed8-d0143bc882b8\") " 
pod="openshift-infra/auto-csr-approver-29565604-9jxxb"
Mar 19 16:04:00 crc kubenswrapper[4771]: I0319 16:04:00.494051 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565604-9jxxb"
Mar 19 16:04:01 crc kubenswrapper[4771]: I0319 16:04:01.006582 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565604-9jxxb"]
Mar 19 16:04:01 crc kubenswrapper[4771]: W0319 16:04:01.014539 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e2569e_154b_490c_8ed8_d0143bc882b8.slice/crio-cd0cd40e68cd6ed28bb36c12fbd9bed47c1a7fd4525eddd5e1b1dc9a4f29e3f0 WatchSource:0}: Error finding container cd0cd40e68cd6ed28bb36c12fbd9bed47c1a7fd4525eddd5e1b1dc9a4f29e3f0: Status 404 returned error can't find the container with id cd0cd40e68cd6ed28bb36c12fbd9bed47c1a7fd4525eddd5e1b1dc9a4f29e3f0
Mar 19 16:04:01 crc kubenswrapper[4771]: I0319 16:04:01.437740 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565604-9jxxb" event={"ID":"e6e2569e-154b-490c-8ed8-d0143bc882b8","Type":"ContainerStarted","Data":"cd0cd40e68cd6ed28bb36c12fbd9bed47c1a7fd4525eddd5e1b1dc9a4f29e3f0"}
Mar 19 16:04:02 crc kubenswrapper[4771]: I0319 16:04:02.448049 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565604-9jxxb" event={"ID":"e6e2569e-154b-490c-8ed8-d0143bc882b8","Type":"ContainerStarted","Data":"3c1795755ffa0dd8dd316334b0614d70565fd4f845e356364a9691ff179500c3"}
Mar 19 16:04:02 crc kubenswrapper[4771]: I0319 16:04:02.509199 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:04:02 crc kubenswrapper[4771]: E0319 16:04:02.509505 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:04:03 crc kubenswrapper[4771]: I0319 16:04:03.461486 4771 generic.go:334] "Generic (PLEG): container finished" podID="e6e2569e-154b-490c-8ed8-d0143bc882b8" containerID="3c1795755ffa0dd8dd316334b0614d70565fd4f845e356364a9691ff179500c3" exitCode=0
Mar 19 16:04:03 crc kubenswrapper[4771]: I0319 16:04:03.461596 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565604-9jxxb" event={"ID":"e6e2569e-154b-490c-8ed8-d0143bc882b8","Type":"ContainerDied","Data":"3c1795755ffa0dd8dd316334b0614d70565fd4f845e356364a9691ff179500c3"}
Mar 19 16:04:04 crc kubenswrapper[4771]: I0319 16:04:04.801253 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565604-9jxxb"
Mar 19 16:04:04 crc kubenswrapper[4771]: I0319 16:04:04.938614 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr6hz\" (UniqueName: \"kubernetes.io/projected/e6e2569e-154b-490c-8ed8-d0143bc882b8-kube-api-access-rr6hz\") pod \"e6e2569e-154b-490c-8ed8-d0143bc882b8\" (UID: \"e6e2569e-154b-490c-8ed8-d0143bc882b8\") "
Mar 19 16:04:04 crc kubenswrapper[4771]: I0319 16:04:04.949538 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e2569e-154b-490c-8ed8-d0143bc882b8-kube-api-access-rr6hz" (OuterVolumeSpecName: "kube-api-access-rr6hz") pod "e6e2569e-154b-490c-8ed8-d0143bc882b8" (UID: "e6e2569e-154b-490c-8ed8-d0143bc882b8"). InnerVolumeSpecName "kube-api-access-rr6hz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:04:05 crc kubenswrapper[4771]: I0319 16:04:05.040822 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr6hz\" (UniqueName: \"kubernetes.io/projected/e6e2569e-154b-490c-8ed8-d0143bc882b8-kube-api-access-rr6hz\") on node \"crc\" DevicePath \"\""
Mar 19 16:04:05 crc kubenswrapper[4771]: I0319 16:04:05.480386 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565604-9jxxb" event={"ID":"e6e2569e-154b-490c-8ed8-d0143bc882b8","Type":"ContainerDied","Data":"cd0cd40e68cd6ed28bb36c12fbd9bed47c1a7fd4525eddd5e1b1dc9a4f29e3f0"}
Mar 19 16:04:05 crc kubenswrapper[4771]: I0319 16:04:05.480431 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd0cd40e68cd6ed28bb36c12fbd9bed47c1a7fd4525eddd5e1b1dc9a4f29e3f0"
Mar 19 16:04:05 crc kubenswrapper[4771]: I0319 16:04:05.480508 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565604-9jxxb"
Mar 19 16:04:05 crc kubenswrapper[4771]: I0319 16:04:05.894435 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565598-6tfnn"]
Mar 19 16:04:05 crc kubenswrapper[4771]: I0319 16:04:05.907287 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565598-6tfnn"]
Mar 19 16:04:07 crc kubenswrapper[4771]: I0319 16:04:07.525368 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303cb7b8-fa9b-4e41-856f-4205f162828e" path="/var/lib/kubelet/pods/303cb7b8-fa9b-4e41-856f-4205f162828e/volumes"
Mar 19 16:04:08 crc kubenswrapper[4771]: I0319 16:04:08.509227 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:04:08 crc kubenswrapper[4771]: E0319 16:04:08.509624 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:04:15 crc kubenswrapper[4771]: I0319 16:04:15.508742 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:04:15 crc kubenswrapper[4771]: E0319 16:04:15.509237 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:04:19 crc kubenswrapper[4771]: I0319 16:04:19.508809 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:04:19 crc kubenswrapper[4771]: E0319 16:04:19.509117 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:04:23 crc kubenswrapper[4771]: I0319 16:04:23.027848 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 16:04:23 crc kubenswrapper[4771]: I0319 16:04:23.028343 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 16:04:30 crc kubenswrapper[4771]: I0319 16:04:30.509739 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:04:30 crc kubenswrapper[4771]: I0319 16:04:30.509821 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:04:30 crc kubenswrapper[4771]: E0319 16:04:30.510302 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:04:30 crc kubenswrapper[4771]: E0319 16:04:30.510374 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:04:43 crc kubenswrapper[4771]: I0319 16:04:43.509615 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:04:43 crc kubenswrapper[4771]: E0319 16:04:43.510718 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:04:44 crc kubenswrapper[4771]: I0319 16:04:44.509079 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:04:44 crc kubenswrapper[4771]: E0319 16:04:44.509530 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:04:46 crc kubenswrapper[4771]: I0319 16:04:46.949293 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l9b2s"]
Mar 19 16:04:46 crc kubenswrapper[4771]: E0319 16:04:46.949970 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e2569e-154b-490c-8ed8-d0143bc882b8" containerName="oc"
Mar 19 16:04:46 crc kubenswrapper[4771]: I0319 16:04:46.950003 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e2569e-154b-490c-8ed8-d0143bc882b8" containerName="oc"
Mar 19 16:04:46 crc kubenswrapper[4771]: I0319 16:04:46.950226 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e2569e-154b-490c-8ed8-d0143bc882b8" containerName="oc"
Mar 19 16:04:46 crc kubenswrapper[4771]: I0319 16:04:46.952896 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:46 crc kubenswrapper[4771]: I0319 16:04:46.961758 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l9b2s"]
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.137858 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-catalog-content\") pod \"certified-operators-l9b2s\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") " pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.137913 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-utilities\") pod \"certified-operators-l9b2s\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") " pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.137934 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csdqd\" (UniqueName: \"kubernetes.io/projected/e6547838-a074-467a-9613-ddf2ef8417c1-kube-api-access-csdqd\") pod \"certified-operators-l9b2s\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") " pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.239705 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-catalog-content\") pod \"certified-operators-l9b2s\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") " pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.239800 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-utilities\") pod \"certified-operators-l9b2s\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") " pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.239837 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csdqd\" (UniqueName: \"kubernetes.io/projected/e6547838-a074-467a-9613-ddf2ef8417c1-kube-api-access-csdqd\") pod \"certified-operators-l9b2s\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") " pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.240405 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-utilities\") pod \"certified-operators-l9b2s\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") " pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.240413 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-catalog-content\") pod \"certified-operators-l9b2s\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") " pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.264195 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csdqd\" (UniqueName: \"kubernetes.io/projected/e6547838-a074-467a-9613-ddf2ef8417c1-kube-api-access-csdqd\") pod \"certified-operators-l9b2s\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") " pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.285001 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.573965 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l9b2s"]
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.896071 4771 generic.go:334] "Generic (PLEG): container finished" podID="e6547838-a074-467a-9613-ddf2ef8417c1" containerID="2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d" exitCode=0
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.896314 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9b2s" event={"ID":"e6547838-a074-467a-9613-ddf2ef8417c1","Type":"ContainerDied","Data":"2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d"}
Mar 19 16:04:47 crc kubenswrapper[4771]: I0319 16:04:47.896432 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9b2s" event={"ID":"e6547838-a074-467a-9613-ddf2ef8417c1","Type":"ContainerStarted","Data":"91982a9912de0d0eb3867736e3c339441cc66879645aedd915e24ea468590d43"}
Mar 19 16:04:49 crc kubenswrapper[4771]: I0319 16:04:49.917917 4771 generic.go:334] "Generic (PLEG): container finished" podID="e6547838-a074-467a-9613-ddf2ef8417c1" containerID="e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40" exitCode=0
Mar 19 16:04:49 crc kubenswrapper[4771]: I0319 16:04:49.918031 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9b2s" event={"ID":"e6547838-a074-467a-9613-ddf2ef8417c1","Type":"ContainerDied","Data":"e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40"}
Mar 19 16:04:50 crc kubenswrapper[4771]: I0319 16:04:50.929445 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9b2s" event={"ID":"e6547838-a074-467a-9613-ddf2ef8417c1","Type":"ContainerStarted","Data":"5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7"}
Mar 19 16:04:50 crc kubenswrapper[4771]: I0319 16:04:50.958866 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l9b2s" podStartSLOduration=2.311492608 podStartE2EDuration="4.958839063s" podCreationTimestamp="2026-03-19 16:04:46 +0000 UTC" firstStartedPulling="2026-03-19 16:04:47.89897382 +0000 UTC m=+2947.127595022" lastFinishedPulling="2026-03-19 16:04:50.546320265 +0000 UTC m=+2949.774941477" observedRunningTime="2026-03-19 16:04:50.952628712 +0000 UTC m=+2950.181249924" watchObservedRunningTime="2026-03-19 16:04:50.958839063 +0000 UTC m=+2950.187460305"
Mar 19 16:04:53 crc kubenswrapper[4771]: I0319 16:04:53.027904 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 16:04:53 crc kubenswrapper[4771]: I0319 16:04:53.028384 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 16:04:55 crc kubenswrapper[4771]: I0319 16:04:55.509691 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:04:55 crc kubenswrapper[4771]: E0319 16:04:55.510355 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:04:57 crc kubenswrapper[4771]: I0319 16:04:57.286392 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:57 crc kubenswrapper[4771]: I0319 16:04:57.287220 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:57 crc kubenswrapper[4771]: I0319 16:04:57.354660 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:57 crc kubenswrapper[4771]: I0319 16:04:57.508708 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:04:57 crc kubenswrapper[4771]: E0319 16:04:57.509024 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:04:58 crc kubenswrapper[4771]: I0319 16:04:58.068711 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:04:58 crc kubenswrapper[4771]: I0319 16:04:58.144644 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l9b2s"]
Mar 19 16:04:59 crc kubenswrapper[4771]: I0319 16:04:59.377899 4771 scope.go:117] "RemoveContainer" containerID="7437276c07b68d9014ad1324fcf83136a2ceabc1d4f19b08ddf3e230ea9ea494"
Mar 19 16:05:00 crc kubenswrapper[4771]: I0319 16:05:00.015522 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l9b2s" podUID="e6547838-a074-467a-9613-ddf2ef8417c1" containerName="registry-server" containerID="cri-o://5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7" gracePeriod=2
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.000556 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.079699 4771 generic.go:334] "Generic (PLEG): container finished" podID="e6547838-a074-467a-9613-ddf2ef8417c1" containerID="5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7" exitCode=0
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.079752 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9b2s" event={"ID":"e6547838-a074-467a-9613-ddf2ef8417c1","Type":"ContainerDied","Data":"5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7"}
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.079790 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9b2s" event={"ID":"e6547838-a074-467a-9613-ddf2ef8417c1","Type":"ContainerDied","Data":"91982a9912de0d0eb3867736e3c339441cc66879645aedd915e24ea468590d43"}
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.079815 4771 scope.go:117] "RemoveContainer" containerID="5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.080022 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l9b2s"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.099118 4771 scope.go:117] "RemoveContainer" containerID="e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.118035 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-catalog-content\") pod \"e6547838-a074-467a-9613-ddf2ef8417c1\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") "
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.118079 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csdqd\" (UniqueName: \"kubernetes.io/projected/e6547838-a074-467a-9613-ddf2ef8417c1-kube-api-access-csdqd\") pod \"e6547838-a074-467a-9613-ddf2ef8417c1\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") "
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.118189 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-utilities\") pod \"e6547838-a074-467a-9613-ddf2ef8417c1\" (UID: \"e6547838-a074-467a-9613-ddf2ef8417c1\") "
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.119316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-utilities" (OuterVolumeSpecName: "utilities") pod "e6547838-a074-467a-9613-ddf2ef8417c1" (UID: "e6547838-a074-467a-9613-ddf2ef8417c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.128192 4771 scope.go:117] "RemoveContainer" containerID="2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.134103 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6547838-a074-467a-9613-ddf2ef8417c1-kube-api-access-csdqd" (OuterVolumeSpecName: "kube-api-access-csdqd") pod "e6547838-a074-467a-9613-ddf2ef8417c1" (UID: "e6547838-a074-467a-9613-ddf2ef8417c1"). InnerVolumeSpecName "kube-api-access-csdqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.174873 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6547838-a074-467a-9613-ddf2ef8417c1" (UID: "e6547838-a074-467a-9613-ddf2ef8417c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.175184 4771 scope.go:117] "RemoveContainer" containerID="5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7"
Mar 19 16:05:01 crc kubenswrapper[4771]: E0319 16:05:01.175604 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7\": container with ID starting with 5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7 not found: ID does not exist" containerID="5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.175640 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7"} err="failed to get container status \"5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7\": rpc error: code = NotFound desc = could not find container \"5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7\": container with ID starting with 5827e1c68efc2a36bef9ccff5fc9c7995441e17885d26eb02122e82d83f1a2f7 not found: ID does not exist"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.175659 4771 scope.go:117] "RemoveContainer" containerID="e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40"
Mar 19 16:05:01 crc kubenswrapper[4771]: E0319 16:05:01.175933 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40\": container with ID starting with e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40 not found: ID does not exist" containerID="e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.175954 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40"} err="failed to get container status \"e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40\": rpc error: code = NotFound desc = could not find container \"e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40\": container with ID starting with e9a72136238625409ce249e32735904b2a1be683f3f06f79ee739c9f19669c40 not found: ID does not exist"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.175967 4771 scope.go:117] "RemoveContainer" containerID="2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d"
Mar 19 16:05:01 crc kubenswrapper[4771]: E0319 16:05:01.176391 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d\": container with ID starting with 2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d not found: ID does not exist" containerID="2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.176418 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d"} err="failed to get container status \"2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d\": rpc error: code = NotFound desc = could not find container \"2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d\": container with ID starting with 2f858f3b61c88fb45a4c4bdd919e67c3109146c00f3b902b43ca20c1fd86163d not found: ID does not exist"
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.221083 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.221153 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6547838-a074-467a-9613-ddf2ef8417c1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.221175 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csdqd\" (UniqueName: \"kubernetes.io/projected/e6547838-a074-467a-9613-ddf2ef8417c1-kube-api-access-csdqd\") on node \"crc\" DevicePath \"\""
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.427319 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l9b2s"]
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.438612 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l9b2s"]
Mar 19 16:05:01 crc kubenswrapper[4771]: I0319 16:05:01.523235 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6547838-a074-467a-9613-ddf2ef8417c1" path="/var/lib/kubelet/pods/e6547838-a074-467a-9613-ddf2ef8417c1/volumes"
Mar 19 16:05:08 crc kubenswrapper[4771]: I0319 16:05:08.509304 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:05:08 crc kubenswrapper[4771]: E0319 16:05:08.510235 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:05:10 crc kubenswrapper[4771]: I0319 16:05:10.509387 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:05:10 crc kubenswrapper[4771]: E0319 16:05:10.510272 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.749585 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vxq2j/must-gather-hmrzh"]
Mar 19 16:05:20 crc kubenswrapper[4771]: E0319 16:05:20.751621 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6547838-a074-467a-9613-ddf2ef8417c1" containerName="extract-utilities"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.751720 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6547838-a074-467a-9613-ddf2ef8417c1" containerName="extract-utilities"
Mar 19 16:05:20 crc kubenswrapper[4771]: E0319 16:05:20.751816 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6547838-a074-467a-9613-ddf2ef8417c1" containerName="registry-server"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.751899 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6547838-a074-467a-9613-ddf2ef8417c1" containerName="registry-server"
Mar 19 16:05:20 crc kubenswrapper[4771]: E0319 16:05:20.751983 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6547838-a074-467a-9613-ddf2ef8417c1" containerName="extract-content"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.752118 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6547838-a074-467a-9613-ddf2ef8417c1" containerName="extract-content"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.752394 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6547838-a074-467a-9613-ddf2ef8417c1" containerName="registry-server"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.753551 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxq2j/must-gather-hmrzh"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.756814 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vxq2j"/"kube-root-ca.crt"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.757185 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vxq2j"/"openshift-service-ca.crt"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.786424 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxq2j/must-gather-hmrzh"]
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.891249 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d98595c-7966-42e7-b19c-ed21464e0c22-must-gather-output\") pod \"must-gather-hmrzh\" (UID: \"4d98595c-7966-42e7-b19c-ed21464e0c22\") " pod="openshift-must-gather-vxq2j/must-gather-hmrzh"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.891534 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjmfw\" (UniqueName: \"kubernetes.io/projected/4d98595c-7966-42e7-b19c-ed21464e0c22-kube-api-access-xjmfw\") pod \"must-gather-hmrzh\" (UID: \"4d98595c-7966-42e7-b19c-ed21464e0c22\") " pod="openshift-must-gather-vxq2j/must-gather-hmrzh"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.993489 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d98595c-7966-42e7-b19c-ed21464e0c22-must-gather-output\") pod \"must-gather-hmrzh\" (UID: \"4d98595c-7966-42e7-b19c-ed21464e0c22\") " pod="openshift-must-gather-vxq2j/must-gather-hmrzh"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.994197 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjmfw\" (UniqueName: \"kubernetes.io/projected/4d98595c-7966-42e7-b19c-ed21464e0c22-kube-api-access-xjmfw\") pod \"must-gather-hmrzh\" (UID: \"4d98595c-7966-42e7-b19c-ed21464e0c22\") " pod="openshift-must-gather-vxq2j/must-gather-hmrzh"
Mar 19 16:05:20 crc kubenswrapper[4771]: I0319 16:05:20.994158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d98595c-7966-42e7-b19c-ed21464e0c22-must-gather-output\") pod \"must-gather-hmrzh\" (UID: \"4d98595c-7966-42e7-b19c-ed21464e0c22\") " pod="openshift-must-gather-vxq2j/must-gather-hmrzh"
Mar 19 16:05:21 crc kubenswrapper[4771]: I0319 16:05:21.018974 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjmfw\" (UniqueName: \"kubernetes.io/projected/4d98595c-7966-42e7-b19c-ed21464e0c22-kube-api-access-xjmfw\") pod \"must-gather-hmrzh\" (UID: \"4d98595c-7966-42e7-b19c-ed21464e0c22\") " pod="openshift-must-gather-vxq2j/must-gather-hmrzh"
Mar 19 16:05:21 crc kubenswrapper[4771]: I0319 16:05:21.072429 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxq2j/must-gather-hmrzh"
Mar 19 16:05:21 crc kubenswrapper[4771]: I0319 16:05:21.521445 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:05:21 crc kubenswrapper[4771]: I0319 16:05:21.522425 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:05:21 crc kubenswrapper[4771]: E0319 16:05:21.522465 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:05:21 crc kubenswrapper[4771]: E0319 16:05:21.522672 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:05:21 crc kubenswrapper[4771]: I0319 16:05:21.577090 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxq2j/must-gather-hmrzh"]
Mar 19 16:05:22 crc kubenswrapper[4771]: I0319 16:05:22.288569 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxq2j/must-gather-hmrzh" event={"ID":"4d98595c-7966-42e7-b19c-ed21464e0c22","Type":"ContainerStarted","Data":"0c7b44212c2910a44ec685ef2a46086e66ed75eae597d8a98a4b78577d4d953e"}
Mar 19 16:05:23 crc kubenswrapper[4771]: I0319 16:05:23.027613 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 16:05:23 crc kubenswrapper[4771]: I0319 16:05:23.028002 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 16:05:23 crc kubenswrapper[4771]: I0319 16:05:23.028056 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp"
Mar 19 16:05:23 crc kubenswrapper[4771]: I0319 16:05:23.029206 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"} pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 16:05:23 crc kubenswrapper[4771]: I0319 16:05:23.029323 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" containerID="cri-o://968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" gracePeriod=600
Mar 19 16:05:23 crc kubenswrapper[4771]: E0319 16:05:23.147022 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp"
podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:05:23 crc kubenswrapper[4771]: I0319 16:05:23.298947 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" exitCode=0 Mar 19 16:05:23 crc kubenswrapper[4771]: I0319 16:05:23.299041 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerDied","Data":"968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"} Mar 19 16:05:23 crc kubenswrapper[4771]: I0319 16:05:23.299087 4771 scope.go:117] "RemoveContainer" containerID="ac106972bbc38acc7af7f4fb7db71c92708ab1cefb8d1a1910e914ead40dfa5f" Mar 19 16:05:23 crc kubenswrapper[4771]: I0319 16:05:23.299803 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:05:23 crc kubenswrapper[4771]: E0319 16:05:23.300194 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.340943 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxq2j/must-gather-hmrzh" event={"ID":"4d98595c-7966-42e7-b19c-ed21464e0c22","Type":"ContainerStarted","Data":"cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565"} Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.341485 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxq2j/must-gather-hmrzh" 
event={"ID":"4d98595c-7966-42e7-b19c-ed21464e0c22","Type":"ContainerStarted","Data":"4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886"} Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.358449 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vxq2j/must-gather-hmrzh" podStartSLOduration=2.3815217349999998 podStartE2EDuration="8.358431853s" podCreationTimestamp="2026-03-19 16:05:20 +0000 UTC" firstStartedPulling="2026-03-19 16:05:21.586582432 +0000 UTC m=+2980.815203664" lastFinishedPulling="2026-03-19 16:05:27.56349257 +0000 UTC m=+2986.792113782" observedRunningTime="2026-03-19 16:05:28.352624622 +0000 UTC m=+2987.581245824" watchObservedRunningTime="2026-03-19 16:05:28.358431853 +0000 UTC m=+2987.587053055" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.442877 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vxq2j/crc-debug-jtr4t"] Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.443876 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.445875 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vxq2j"/"default-dockercfg-sj7jj" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.525826 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx6sm\" (UniqueName: \"kubernetes.io/projected/e42a9a14-8149-4b82-8530-3e770447ecf9-kube-api-access-zx6sm\") pod \"crc-debug-jtr4t\" (UID: \"e42a9a14-8149-4b82-8530-3e770447ecf9\") " pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.526194 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e42a9a14-8149-4b82-8530-3e770447ecf9-host\") pod \"crc-debug-jtr4t\" (UID: \"e42a9a14-8149-4b82-8530-3e770447ecf9\") " pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.627804 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx6sm\" (UniqueName: \"kubernetes.io/projected/e42a9a14-8149-4b82-8530-3e770447ecf9-kube-api-access-zx6sm\") pod \"crc-debug-jtr4t\" (UID: \"e42a9a14-8149-4b82-8530-3e770447ecf9\") " pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.627917 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e42a9a14-8149-4b82-8530-3e770447ecf9-host\") pod \"crc-debug-jtr4t\" (UID: \"e42a9a14-8149-4b82-8530-3e770447ecf9\") " pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.628089 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/e42a9a14-8149-4b82-8530-3e770447ecf9-host\") pod \"crc-debug-jtr4t\" (UID: \"e42a9a14-8149-4b82-8530-3e770447ecf9\") " pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.661705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx6sm\" (UniqueName: \"kubernetes.io/projected/e42a9a14-8149-4b82-8530-3e770447ecf9-kube-api-access-zx6sm\") pod \"crc-debug-jtr4t\" (UID: \"e42a9a14-8149-4b82-8530-3e770447ecf9\") " pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:05:28 crc kubenswrapper[4771]: I0319 16:05:28.757859 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:05:28 crc kubenswrapper[4771]: W0319 16:05:28.788216 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode42a9a14_8149_4b82_8530_3e770447ecf9.slice/crio-fa0e249da3160ff8c9e078d643f3f910d733fdae31c171b623fee0bab7077f5e WatchSource:0}: Error finding container fa0e249da3160ff8c9e078d643f3f910d733fdae31c171b623fee0bab7077f5e: Status 404 returned error can't find the container with id fa0e249da3160ff8c9e078d643f3f910d733fdae31c171b623fee0bab7077f5e Mar 19 16:05:29 crc kubenswrapper[4771]: I0319 16:05:29.359720 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" event={"ID":"e42a9a14-8149-4b82-8530-3e770447ecf9","Type":"ContainerStarted","Data":"fa0e249da3160ff8c9e078d643f3f910d733fdae31c171b623fee0bab7077f5e"} Mar 19 16:05:34 crc kubenswrapper[4771]: I0319 16:05:34.508726 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:05:34 crc kubenswrapper[4771]: E0319 16:05:34.509453 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:05:35 crc kubenswrapper[4771]: I0319 16:05:35.508763 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:05:35 crc kubenswrapper[4771]: E0319 16:05:35.509062 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:05:35 crc kubenswrapper[4771]: I0319 16:05:35.509041 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:05:35 crc kubenswrapper[4771]: E0319 16:05:35.509407 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:05:40 crc kubenswrapper[4771]: I0319 16:05:40.450251 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" event={"ID":"e42a9a14-8149-4b82-8530-3e770447ecf9","Type":"ContainerStarted","Data":"1a27177ece82ade292d61589b58a98dec0d5d2382505a91e192809e068867c0d"} Mar 19 16:05:40 crc kubenswrapper[4771]: I0319 16:05:40.474403 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" 
podStartSLOduration=1.176122112 podStartE2EDuration="12.47438783s" podCreationTimestamp="2026-03-19 16:05:28 +0000 UTC" firstStartedPulling="2026-03-19 16:05:28.79137356 +0000 UTC m=+2988.019994762" lastFinishedPulling="2026-03-19 16:05:40.089639278 +0000 UTC m=+2999.318260480" observedRunningTime="2026-03-19 16:05:40.467653746 +0000 UTC m=+2999.696274938" watchObservedRunningTime="2026-03-19 16:05:40.47438783 +0000 UTC m=+2999.703009022" Mar 19 16:05:46 crc kubenswrapper[4771]: I0319 16:05:46.509803 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:05:46 crc kubenswrapper[4771]: E0319 16:05:46.510615 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:05:49 crc kubenswrapper[4771]: I0319 16:05:49.508744 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:05:49 crc kubenswrapper[4771]: I0319 16:05:49.509206 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:05:49 crc kubenswrapper[4771]: E0319 16:05:49.509445 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:05:49 crc kubenswrapper[4771]: E0319 16:05:49.509534 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.550463 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c9jtj"] Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.552638 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.563000 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9jtj"] Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.715977 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec3538b3-4651-4029-9e15-c151201f08a0-catalog-content\") pod \"redhat-operators-c9jtj\" (UID: \"ec3538b3-4651-4029-9e15-c151201f08a0\") " pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.716076 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7djv5\" (UniqueName: \"kubernetes.io/projected/ec3538b3-4651-4029-9e15-c151201f08a0-kube-api-access-7djv5\") pod \"redhat-operators-c9jtj\" (UID: \"ec3538b3-4651-4029-9e15-c151201f08a0\") " pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.716853 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec3538b3-4651-4029-9e15-c151201f08a0-utilities\") pod \"redhat-operators-c9jtj\" (UID: \"ec3538b3-4651-4029-9e15-c151201f08a0\") " 
pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.818343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec3538b3-4651-4029-9e15-c151201f08a0-utilities\") pod \"redhat-operators-c9jtj\" (UID: \"ec3538b3-4651-4029-9e15-c151201f08a0\") " pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.818806 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec3538b3-4651-4029-9e15-c151201f08a0-utilities\") pod \"redhat-operators-c9jtj\" (UID: \"ec3538b3-4651-4029-9e15-c151201f08a0\") " pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.819150 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec3538b3-4651-4029-9e15-c151201f08a0-catalog-content\") pod \"redhat-operators-c9jtj\" (UID: \"ec3538b3-4651-4029-9e15-c151201f08a0\") " pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.819293 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7djv5\" (UniqueName: \"kubernetes.io/projected/ec3538b3-4651-4029-9e15-c151201f08a0-kube-api-access-7djv5\") pod \"redhat-operators-c9jtj\" (UID: \"ec3538b3-4651-4029-9e15-c151201f08a0\") " pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.819416 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec3538b3-4651-4029-9e15-c151201f08a0-catalog-content\") pod \"redhat-operators-c9jtj\" (UID: \"ec3538b3-4651-4029-9e15-c151201f08a0\") " pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc 
kubenswrapper[4771]: I0319 16:05:54.839519 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7djv5\" (UniqueName: \"kubernetes.io/projected/ec3538b3-4651-4029-9e15-c151201f08a0-kube-api-access-7djv5\") pod \"redhat-operators-c9jtj\" (UID: \"ec3538b3-4651-4029-9e15-c151201f08a0\") " pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:54 crc kubenswrapper[4771]: I0319 16:05:54.945905 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:05:55 crc kubenswrapper[4771]: I0319 16:05:55.419700 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9jtj"] Mar 19 16:05:55 crc kubenswrapper[4771]: I0319 16:05:55.578147 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9jtj" event={"ID":"ec3538b3-4651-4029-9e15-c151201f08a0","Type":"ContainerStarted","Data":"c2550239e7974d8f61a9e0014407228e1ccbc56abecc61554dc0bc0ddcf1ab0c"} Mar 19 16:05:57 crc kubenswrapper[4771]: I0319 16:05:57.594695 4771 generic.go:334] "Generic (PLEG): container finished" podID="ec3538b3-4651-4029-9e15-c151201f08a0" containerID="906e1b3ca9d80145d5fb2fdf206004450ee6e099826f1ad8881db6bbe703d196" exitCode=0 Mar 19 16:05:57 crc kubenswrapper[4771]: I0319 16:05:57.594789 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9jtj" event={"ID":"ec3538b3-4651-4029-9e15-c151201f08a0","Type":"ContainerDied","Data":"906e1b3ca9d80145d5fb2fdf206004450ee6e099826f1ad8881db6bbe703d196"} Mar 19 16:05:59 crc kubenswrapper[4771]: I0319 16:05:59.613794 4771 generic.go:334] "Generic (PLEG): container finished" podID="e42a9a14-8149-4b82-8530-3e770447ecf9" containerID="1a27177ece82ade292d61589b58a98dec0d5d2382505a91e192809e068867c0d" exitCode=0 Mar 19 16:05:59 crc kubenswrapper[4771]: I0319 16:05:59.613842 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" event={"ID":"e42a9a14-8149-4b82-8530-3e770447ecf9","Type":"ContainerDied","Data":"1a27177ece82ade292d61589b58a98dec0d5d2382505a91e192809e068867c0d"} Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.136932 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565606-rpgg5"] Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.138132 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565606-rpgg5" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.141107 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.142662 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.142693 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.149479 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565606-rpgg5"] Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.308089 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g78m\" (UniqueName: \"kubernetes.io/projected/ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1-kube-api-access-8g78m\") pod \"auto-csr-approver-29565606-rpgg5\" (UID: \"ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1\") " pod="openshift-infra/auto-csr-approver-29565606-rpgg5" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.409122 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g78m\" (UniqueName: \"kubernetes.io/projected/ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1-kube-api-access-8g78m\") pod 
\"auto-csr-approver-29565606-rpgg5\" (UID: \"ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1\") " pod="openshift-infra/auto-csr-approver-29565606-rpgg5" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.433478 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g78m\" (UniqueName: \"kubernetes.io/projected/ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1-kube-api-access-8g78m\") pod \"auto-csr-approver-29565606-rpgg5\" (UID: \"ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1\") " pod="openshift-infra/auto-csr-approver-29565606-rpgg5" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.459875 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565606-rpgg5" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.509314 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.509487 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:06:00 crc kubenswrapper[4771]: E0319 16:06:00.509517 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:06:00 crc kubenswrapper[4771]: E0319 16:06:00.509846 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.715350 4771 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.745767 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vxq2j/crc-debug-jtr4t"] Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.754106 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vxq2j/crc-debug-jtr4t"] Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.821014 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx6sm\" (UniqueName: \"kubernetes.io/projected/e42a9a14-8149-4b82-8530-3e770447ecf9-kube-api-access-zx6sm\") pod \"e42a9a14-8149-4b82-8530-3e770447ecf9\" (UID: \"e42a9a14-8149-4b82-8530-3e770447ecf9\") " Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.821262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e42a9a14-8149-4b82-8530-3e770447ecf9-host\") pod \"e42a9a14-8149-4b82-8530-3e770447ecf9\" (UID: \"e42a9a14-8149-4b82-8530-3e770447ecf9\") " Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.821336 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e42a9a14-8149-4b82-8530-3e770447ecf9-host" (OuterVolumeSpecName: "host") pod "e42a9a14-8149-4b82-8530-3e770447ecf9" (UID: "e42a9a14-8149-4b82-8530-3e770447ecf9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.821831 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e42a9a14-8149-4b82-8530-3e770447ecf9-host\") on node \"crc\" DevicePath \"\"" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.826785 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42a9a14-8149-4b82-8530-3e770447ecf9-kube-api-access-zx6sm" (OuterVolumeSpecName: "kube-api-access-zx6sm") pod "e42a9a14-8149-4b82-8530-3e770447ecf9" (UID: "e42a9a14-8149-4b82-8530-3e770447ecf9"). InnerVolumeSpecName "kube-api-access-zx6sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.912598 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565606-rpgg5"] Mar 19 16:06:00 crc kubenswrapper[4771]: W0319 16:06:00.922064 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef84e2f9_ca97_4b5e_adaa_c8d337e7b0f1.slice/crio-3c5259b84ce6e23d8a3deb91d26151071eba25de6294d25618733e54ba51ed9b WatchSource:0}: Error finding container 3c5259b84ce6e23d8a3deb91d26151071eba25de6294d25618733e54ba51ed9b: Status 404 returned error can't find the container with id 3c5259b84ce6e23d8a3deb91d26151071eba25de6294d25618733e54ba51ed9b Mar 19 16:06:00 crc kubenswrapper[4771]: I0319 16:06:00.922869 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx6sm\" (UniqueName: \"kubernetes.io/projected/e42a9a14-8149-4b82-8530-3e770447ecf9-kube-api-access-zx6sm\") on node \"crc\" DevicePath \"\"" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.520348 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42a9a14-8149-4b82-8530-3e770447ecf9" 
path="/var/lib/kubelet/pods/e42a9a14-8149-4b82-8530-3e770447ecf9/volumes" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.636000 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565606-rpgg5" event={"ID":"ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1","Type":"ContainerStarted","Data":"3c5259b84ce6e23d8a3deb91d26151071eba25de6294d25618733e54ba51ed9b"} Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.637405 4771 scope.go:117] "RemoveContainer" containerID="1a27177ece82ade292d61589b58a98dec0d5d2382505a91e192809e068867c0d" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.637486 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxq2j/crc-debug-jtr4t" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.905902 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vxq2j/crc-debug-kxg9n"] Mar 19 16:06:01 crc kubenswrapper[4771]: E0319 16:06:01.906702 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42a9a14-8149-4b82-8530-3e770447ecf9" containerName="container-00" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.906717 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42a9a14-8149-4b82-8530-3e770447ecf9" containerName="container-00" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.906883 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42a9a14-8149-4b82-8530-3e770447ecf9" containerName="container-00" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.907383 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.909191 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vxq2j"/"default-dockercfg-sj7jj" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.957546 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3ac11f8-27e8-4a7d-b019-15427228ae4e-host\") pod \"crc-debug-kxg9n\" (UID: \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\") " pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:01 crc kubenswrapper[4771]: I0319 16:06:01.957606 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w76vp\" (UniqueName: \"kubernetes.io/projected/d3ac11f8-27e8-4a7d-b019-15427228ae4e-kube-api-access-w76vp\") pod \"crc-debug-kxg9n\" (UID: \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\") " pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.059831 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3ac11f8-27e8-4a7d-b019-15427228ae4e-host\") pod \"crc-debug-kxg9n\" (UID: \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\") " pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.059887 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w76vp\" (UniqueName: \"kubernetes.io/projected/d3ac11f8-27e8-4a7d-b019-15427228ae4e-kube-api-access-w76vp\") pod \"crc-debug-kxg9n\" (UID: \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\") " pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.060021 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d3ac11f8-27e8-4a7d-b019-15427228ae4e-host\") pod \"crc-debug-kxg9n\" (UID: \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\") " pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.088001 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w76vp\" (UniqueName: \"kubernetes.io/projected/d3ac11f8-27e8-4a7d-b019-15427228ae4e-kube-api-access-w76vp\") pod \"crc-debug-kxg9n\" (UID: \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\") " pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.222645 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.645274 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565606-rpgg5" event={"ID":"ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1","Type":"ContainerStarted","Data":"76b846dee7526d06a452ddead1528d6db39bb5945e1441b8b6aac9a5efb7a701"} Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.649629 4771 generic.go:334] "Generic (PLEG): container finished" podID="d3ac11f8-27e8-4a7d-b019-15427228ae4e" containerID="1f888f5f6c958c6ebb56d5023b55a9e29a06b213c89c45cacd8f7a7ad80b9a69" exitCode=1 Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.649668 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" event={"ID":"d3ac11f8-27e8-4a7d-b019-15427228ae4e","Type":"ContainerDied","Data":"1f888f5f6c958c6ebb56d5023b55a9e29a06b213c89c45cacd8f7a7ad80b9a69"} Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.649690 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" event={"ID":"d3ac11f8-27e8-4a7d-b019-15427228ae4e","Type":"ContainerStarted","Data":"b47a525465366576cf921c4a665ab75e92116bb01e7da50646c0af60af046a0c"} Mar 19 
16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.662670 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565606-rpgg5" podStartSLOduration=1.470016972 podStartE2EDuration="2.662652914s" podCreationTimestamp="2026-03-19 16:06:00 +0000 UTC" firstStartedPulling="2026-03-19 16:06:00.924209997 +0000 UTC m=+3020.152831199" lastFinishedPulling="2026-03-19 16:06:02.116845939 +0000 UTC m=+3021.345467141" observedRunningTime="2026-03-19 16:06:02.657255462 +0000 UTC m=+3021.885876684" watchObservedRunningTime="2026-03-19 16:06:02.662652914 +0000 UTC m=+3021.891274116" Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.690254 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vxq2j/crc-debug-kxg9n"] Mar 19 16:06:02 crc kubenswrapper[4771]: I0319 16:06:02.699367 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vxq2j/crc-debug-kxg9n"] Mar 19 16:06:03 crc kubenswrapper[4771]: I0319 16:06:03.659854 4771 generic.go:334] "Generic (PLEG): container finished" podID="ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1" containerID="76b846dee7526d06a452ddead1528d6db39bb5945e1441b8b6aac9a5efb7a701" exitCode=0 Mar 19 16:06:03 crc kubenswrapper[4771]: I0319 16:06:03.659954 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565606-rpgg5" event={"ID":"ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1","Type":"ContainerDied","Data":"76b846dee7526d06a452ddead1528d6db39bb5945e1441b8b6aac9a5efb7a701"} Mar 19 16:06:03 crc kubenswrapper[4771]: I0319 16:06:03.757192 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:03 crc kubenswrapper[4771]: I0319 16:06:03.800330 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w76vp\" (UniqueName: \"kubernetes.io/projected/d3ac11f8-27e8-4a7d-b019-15427228ae4e-kube-api-access-w76vp\") pod \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\" (UID: \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\") " Mar 19 16:06:03 crc kubenswrapper[4771]: I0319 16:06:03.800468 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3ac11f8-27e8-4a7d-b019-15427228ae4e-host\") pod \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\" (UID: \"d3ac11f8-27e8-4a7d-b019-15427228ae4e\") " Mar 19 16:06:03 crc kubenswrapper[4771]: I0319 16:06:03.800898 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3ac11f8-27e8-4a7d-b019-15427228ae4e-host" (OuterVolumeSpecName: "host") pod "d3ac11f8-27e8-4a7d-b019-15427228ae4e" (UID: "d3ac11f8-27e8-4a7d-b019-15427228ae4e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:06:03 crc kubenswrapper[4771]: I0319 16:06:03.806795 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ac11f8-27e8-4a7d-b019-15427228ae4e-kube-api-access-w76vp" (OuterVolumeSpecName: "kube-api-access-w76vp") pod "d3ac11f8-27e8-4a7d-b019-15427228ae4e" (UID: "d3ac11f8-27e8-4a7d-b019-15427228ae4e"). InnerVolumeSpecName "kube-api-access-w76vp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:06:03 crc kubenswrapper[4771]: I0319 16:06:03.902238 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w76vp\" (UniqueName: \"kubernetes.io/projected/d3ac11f8-27e8-4a7d-b019-15427228ae4e-kube-api-access-w76vp\") on node \"crc\" DevicePath \"\"" Mar 19 16:06:03 crc kubenswrapper[4771]: I0319 16:06:03.902315 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3ac11f8-27e8-4a7d-b019-15427228ae4e-host\") on node \"crc\" DevicePath \"\"" Mar 19 16:06:04 crc kubenswrapper[4771]: I0319 16:06:04.510220 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:06:04 crc kubenswrapper[4771]: E0319 16:06:04.510757 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:06:04 crc kubenswrapper[4771]: I0319 16:06:04.672285 4771 scope.go:117] "RemoveContainer" containerID="1f888f5f6c958c6ebb56d5023b55a9e29a06b213c89c45cacd8f7a7ad80b9a69" Mar 19 16:06:04 crc kubenswrapper[4771]: I0319 16:06:04.672275 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxq2j/crc-debug-kxg9n" Mar 19 16:06:05 crc kubenswrapper[4771]: I0319 16:06:05.519959 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ac11f8-27e8-4a7d-b019-15427228ae4e" path="/var/lib/kubelet/pods/d3ac11f8-27e8-4a7d-b019-15427228ae4e/volumes" Mar 19 16:06:10 crc kubenswrapper[4771]: I0319 16:06:10.949235 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565606-rpgg5" Mar 19 16:06:11 crc kubenswrapper[4771]: I0319 16:06:11.027025 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g78m\" (UniqueName: \"kubernetes.io/projected/ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1-kube-api-access-8g78m\") pod \"ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1\" (UID: \"ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1\") " Mar 19 16:06:11 crc kubenswrapper[4771]: I0319 16:06:11.034207 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1-kube-api-access-8g78m" (OuterVolumeSpecName: "kube-api-access-8g78m") pod "ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1" (UID: "ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1"). InnerVolumeSpecName "kube-api-access-8g78m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:06:11 crc kubenswrapper[4771]: I0319 16:06:11.129384 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g78m\" (UniqueName: \"kubernetes.io/projected/ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1-kube-api-access-8g78m\") on node \"crc\" DevicePath \"\"" Mar 19 16:06:11 crc kubenswrapper[4771]: I0319 16:06:11.729820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9jtj" event={"ID":"ec3538b3-4651-4029-9e15-c151201f08a0","Type":"ContainerStarted","Data":"407868f2f3496f135e41fa7ff3e60fa85a8c138de8db037583b01b2d372e68d1"} Mar 19 16:06:11 crc kubenswrapper[4771]: I0319 16:06:11.732042 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565606-rpgg5" event={"ID":"ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1","Type":"ContainerDied","Data":"3c5259b84ce6e23d8a3deb91d26151071eba25de6294d25618733e54ba51ed9b"} Mar 19 16:06:11 crc kubenswrapper[4771]: I0319 16:06:11.732065 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c5259b84ce6e23d8a3deb91d26151071eba25de6294d25618733e54ba51ed9b" Mar 19 16:06:11 crc kubenswrapper[4771]: I0319 16:06:11.732067 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565606-rpgg5" Mar 19 16:06:12 crc kubenswrapper[4771]: I0319 16:06:12.022713 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565600-vtrl5"] Mar 19 16:06:12 crc kubenswrapper[4771]: I0319 16:06:12.028366 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565600-vtrl5"] Mar 19 16:06:12 crc kubenswrapper[4771]: I0319 16:06:12.508341 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:06:12 crc kubenswrapper[4771]: E0319 16:06:12.508792 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:06:12 crc kubenswrapper[4771]: I0319 16:06:12.741652 4771 generic.go:334] "Generic (PLEG): container finished" podID="ec3538b3-4651-4029-9e15-c151201f08a0" containerID="407868f2f3496f135e41fa7ff3e60fa85a8c138de8db037583b01b2d372e68d1" exitCode=0 Mar 19 16:06:12 crc kubenswrapper[4771]: I0319 16:06:12.741703 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9jtj" event={"ID":"ec3538b3-4651-4029-9e15-c151201f08a0","Type":"ContainerDied","Data":"407868f2f3496f135e41fa7ff3e60fa85a8c138de8db037583b01b2d372e68d1"} Mar 19 16:06:13 crc kubenswrapper[4771]: I0319 16:06:13.520400 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3530f179-780f-44a0-81f5-fb76a255fc0d" path="/var/lib/kubelet/pods/3530f179-780f-44a0-81f5-fb76a255fc0d/volumes" Mar 19 16:06:13 crc kubenswrapper[4771]: I0319 16:06:13.757239 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-c9jtj" event={"ID":"ec3538b3-4651-4029-9e15-c151201f08a0","Type":"ContainerStarted","Data":"8c1b1551bcd009ab590afd47bafb5962421a3e6a5d9674e82bdf19124746bd59"} Mar 19 16:06:13 crc kubenswrapper[4771]: I0319 16:06:13.786103 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c9jtj" podStartSLOduration=3.875292471 podStartE2EDuration="19.786082895s" podCreationTimestamp="2026-03-19 16:05:54 +0000 UTC" firstStartedPulling="2026-03-19 16:05:57.59694581 +0000 UTC m=+3016.825567002" lastFinishedPulling="2026-03-19 16:06:13.507736224 +0000 UTC m=+3032.736357426" observedRunningTime="2026-03-19 16:06:13.785197873 +0000 UTC m=+3033.013819095" watchObservedRunningTime="2026-03-19 16:06:13.786082895 +0000 UTC m=+3033.014704097" Mar 19 16:06:14 crc kubenswrapper[4771]: I0319 16:06:14.946691 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:06:14 crc kubenswrapper[4771]: I0319 16:06:14.948021 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:06:15 crc kubenswrapper[4771]: I0319 16:06:15.508350 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032" Mar 19 16:06:15 crc kubenswrapper[4771]: E0319 16:06:15.508906 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:06:15 crc kubenswrapper[4771]: I0319 16:06:15.998230 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c9jtj" 
podUID="ec3538b3-4651-4029-9e15-c151201f08a0" containerName="registry-server" probeResult="failure" output=< Mar 19 16:06:15 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Mar 19 16:06:15 crc kubenswrapper[4771]: > Mar 19 16:06:17 crc kubenswrapper[4771]: I0319 16:06:17.510797 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:06:17 crc kubenswrapper[4771]: E0319 16:06:17.511077 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 16:06:24.297795 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7ff5475cc9-hpc64_ac02ff02-302b-4ee5-98d2-59153e9f8d48/init/0.log" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 16:06:24.508191 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855" Mar 19 16:06:24 crc kubenswrapper[4771]: E0319 16:06:24.508472 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 16:06:24.526343 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7ff5475cc9-hpc64_ac02ff02-302b-4ee5-98d2-59153e9f8d48/dnsmasq-dns/0.log" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 
16:06:24.544971 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7ff5475cc9-hpc64_ac02ff02-302b-4ee5-98d2-59153e9f8d48/init/0.log" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 16:06:24.555782 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_49580579-6baf-4e6b-85e1-0dba0fb59d97/kube-state-metrics/0.log" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 16:06:24.782555 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_daa4604a-2110-4000-a893-d7f308d29bce/mysql-bootstrap/0.log" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 16:06:24.822536 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_1e06921b-f2eb-4ef0-8256-405214a269e0/memcached/0.log" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 16:06:24.968900 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_daa4604a-2110-4000-a893-d7f308d29bce/mysql-bootstrap/0.log" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 16:06:24.983187 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_daa4604a-2110-4000-a893-d7f308d29bce/galera/0.log" Mar 19 16:06:24 crc kubenswrapper[4771]: I0319 16:06:24.998435 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.046177 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c9jtj" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.072906 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_66d10600-3f91-4e77-a751-7f6fbe7148ea/mysql-bootstrap/0.log" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.214115 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_66d10600-3f91-4e77-a751-7f6fbe7148ea/mysql-bootstrap/0.log" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.265404 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_66d10600-3f91-4e77-a751-7f6fbe7148ea/galera/0.log" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.279595 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lrwtz_def3f27c-03ff-4f92-895a-b3fb6ea64130/openstack-network-exporter/0.log" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.401751 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rs8tv_9af24d0b-354c-4109-a11a-2e56c65b8b0a/ovsdb-server-init/0.log" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.576432 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9jtj"] Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.668796 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rs8tv_9af24d0b-354c-4109-a11a-2e56c65b8b0a/ovs-vswitchd/0.log" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.690873 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rs8tv_9af24d0b-354c-4109-a11a-2e56c65b8b0a/ovsdb-server-init/0.log" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.712600 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rs8tv_9af24d0b-354c-4109-a11a-2e56c65b8b0a/ovsdb-server/0.log" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.748467 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t9tdl"] Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.748681 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t9tdl" 
podUID="31742b90-a657-49d5-be4a-7e415211fc0a" containerName="registry-server" containerID="cri-o://d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a" gracePeriod=2 Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.839601 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-w5jsx_e8c6ed19-f258-4da0-966a-6c538b85dce1/ovn-controller/0.log" Mar 19 16:06:25 crc kubenswrapper[4771]: I0319 16:06:25.979445 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cbbec9e9-0922-4f52-aafc-409365715a4a/ovn-northd/0.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:25.999971 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cbbec9e9-0922-4f52-aafc-409365715a4a/openstack-network-exporter/0.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.231162 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.269917 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_71aa1a31-80b5-40d9-9549-f12b2f0c34aa/openstack-network-exporter/0.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.324296 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_71aa1a31-80b5-40d9-9549-f12b2f0c34aa/ovsdbserver-nb/0.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.379032 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-utilities\") pod \"31742b90-a657-49d5-be4a-7e415211fc0a\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.379075 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-catalog-content\") pod \"31742b90-a657-49d5-be4a-7e415211fc0a\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.379273 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f/openstack-network-exporter/0.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.379282 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-utilities" (OuterVolumeSpecName: "utilities") pod "31742b90-a657-49d5-be4a-7e415211fc0a" (UID: "31742b90-a657-49d5-be4a-7e415211fc0a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.379362 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm289\" (UniqueName: \"kubernetes.io/projected/31742b90-a657-49d5-be4a-7e415211fc0a-kube-api-access-nm289\") pod \"31742b90-a657-49d5-be4a-7e415211fc0a\" (UID: \"31742b90-a657-49d5-be4a-7e415211fc0a\") " Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.379703 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.388117 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31742b90-a657-49d5-be4a-7e415211fc0a-kube-api-access-nm289" (OuterVolumeSpecName: "kube-api-access-nm289") pod "31742b90-a657-49d5-be4a-7e415211fc0a" (UID: "31742b90-a657-49d5-be4a-7e415211fc0a"). InnerVolumeSpecName "kube-api-access-nm289". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.480674 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm289\" (UniqueName: \"kubernetes.io/projected/31742b90-a657-49d5-be4a-7e415211fc0a-kube-api-access-nm289\") on node \"crc\" DevicePath \"\"" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.491840 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31742b90-a657-49d5-be4a-7e415211fc0a" (UID: "31742b90-a657-49d5-be4a-7e415211fc0a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.534029 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e1c08cf-2d38-438c-b2b0-6e5b4f0b728f/ovsdbserver-sb/0.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.573751 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c065c328-37e2-4905-9d1e-82208eab196e/setup-container/0.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.581763 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31742b90-a657-49d5-be4a-7e415211fc0a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.827709 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c065c328-37e2-4905-9d1e-82208eab196e/rabbitmq/10.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.829283 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c065c328-37e2-4905-9d1e-82208eab196e/rabbitmq/10.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.846862 4771 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c065c328-37e2-4905-9d1e-82208eab196e/setup-container/0.log" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.869906 4771 generic.go:334] "Generic (PLEG): container finished" podID="31742b90-a657-49d5-be4a-7e415211fc0a" containerID="d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a" exitCode=0 Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.869958 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9tdl" event={"ID":"31742b90-a657-49d5-be4a-7e415211fc0a","Type":"ContainerDied","Data":"d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a"} Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.870054 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9tdl" event={"ID":"31742b90-a657-49d5-be4a-7e415211fc0a","Type":"ContainerDied","Data":"41114c4dfa2576676d7f7a88e94e55f6d4ea1bfeac75cbb10329cb85bc0dadde"} Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.870078 4771 scope.go:117] "RemoveContainer" containerID="d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.869975 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t9tdl" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.897677 4771 scope.go:117] "RemoveContainer" containerID="008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.900503 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t9tdl"] Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.911232 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t9tdl"] Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.934657 4771 scope.go:117] "RemoveContainer" containerID="0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.983313 4771 scope.go:117] "RemoveContainer" containerID="d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a" Mar 19 16:06:26 crc kubenswrapper[4771]: E0319 16:06:26.984650 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a\": container with ID starting with d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a not found: ID does not exist" containerID="d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a" Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.984704 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a"} err="failed to get container status \"d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a\": rpc error: code = NotFound desc = could not find container \"d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a\": container with ID starting with d3e3e8d3e84c8e3981306c42dc3d268d97ee7feaf61ebe183b247fbb77d4cf9a not found: ID does 
not exist"
Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.984736 4771 scope.go:117] "RemoveContainer" containerID="008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f"
Mar 19 16:06:26 crc kubenswrapper[4771]: E0319 16:06:26.985363 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f\": container with ID starting with 008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f not found: ID does not exist" containerID="008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f"
Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.985409 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f"} err="failed to get container status \"008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f\": rpc error: code = NotFound desc = could not find container \"008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f\": container with ID starting with 008ee36621c45fddc076dfea2c90a4979dadcf5df0155e4f67d8754f43b3db4f not found: ID does not exist"
Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.985429 4771 scope.go:117] "RemoveContainer" containerID="0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0"
Mar 19 16:06:26 crc kubenswrapper[4771]: E0319 16:06:26.985755 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0\": container with ID starting with 0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0 not found: ID does not exist" containerID="0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0"
Mar 19 16:06:26 crc kubenswrapper[4771]: I0319 16:06:26.985828 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0"} err="failed to get container status \"0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0\": rpc error: code = NotFound desc = could not find container \"0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0\": container with ID starting with 0ce82589d1ff93ebaffbcf52b5e42d3f52fd74be9478f8859f546a2c08dd2bb0 not found: ID does not exist"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.092150 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_74c5f622-0ced-47f9-80d5-75a09acfafc0/setup-container/0.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.283630 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_74c5f622-0ced-47f9-80d5-75a09acfafc0/setup-container/0.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.297623 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_74c5f622-0ced-47f9-80d5-75a09acfafc0/rabbitmq/10.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.303664 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_74c5f622-0ced-47f9-80d5-75a09acfafc0/rabbitmq/10.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.496459 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7tbkt_fde481a0-3182-4ad8-90ff-7fc8da0ecde2/swift-ring-rebalance/0.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.516843 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31742b90-a657-49d5-be4a-7e415211fc0a" path="/var/lib/kubelet/pods/31742b90-a657-49d5-be4a-7e415211fc0a/volumes"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.598132 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/account-reaper/0.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.600291 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/account-auditor/0.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.720962 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/account-replicator/0.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.814763 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/account-server/0.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.861075 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/container-replicator/0.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.886578 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/container-auditor/0.log"
Mar 19 16:06:27 crc kubenswrapper[4771]: I0319 16:06:27.951702 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/container-server/0.log"
Mar 19 16:06:28 crc kubenswrapper[4771]: I0319 16:06:28.097113 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/object-auditor/0.log"
Mar 19 16:06:28 crc kubenswrapper[4771]: I0319 16:06:28.247400 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/container-updater/0.log"
Mar 19 16:06:28 crc kubenswrapper[4771]: I0319 16:06:28.279278 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/object-expirer/0.log"
Mar 19 16:06:28 crc kubenswrapper[4771]: I0319 16:06:28.342407 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/object-replicator/0.log"
Mar 19 16:06:28 crc kubenswrapper[4771]: I0319 16:06:28.410962 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/object-server/0.log"
Mar 19 16:06:28 crc kubenswrapper[4771]: I0319 16:06:28.448163 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/object-updater/0.log"
Mar 19 16:06:28 crc kubenswrapper[4771]: I0319 16:06:28.508282 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:06:28 crc kubenswrapper[4771]: E0319 16:06:28.508613 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:06:28 crc kubenswrapper[4771]: I0319 16:06:28.546511 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/rsync/0.log"
Mar 19 16:06:28 crc kubenswrapper[4771]: I0319 16:06:28.557975 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67d58e24-649b-4142-a62a-64c9919fe0e4/swift-recon-cron/0.log"
Mar 19 16:06:32 crc kubenswrapper[4771]: I0319 16:06:32.509395 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"
Mar 19 16:06:32 crc kubenswrapper[4771]: E0319 16:06:32.510240 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 16:06:37 crc kubenswrapper[4771]: I0319 16:06:37.509109 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:06:37 crc kubenswrapper[4771]: E0319 16:06:37.509718 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:06:43 crc kubenswrapper[4771]: I0319 16:06:43.508552 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:06:43 crc kubenswrapper[4771]: E0319 16:06:43.509267 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:06:45 crc kubenswrapper[4771]: I0319 16:06:45.510257 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"
Mar 19 16:06:45 crc kubenswrapper[4771]: E0319 16:06:45.510468 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 16:06:46 crc kubenswrapper[4771]: I0319 16:06:46.272444 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-x9hpx_4717b1db-fd1d-4e9f-b04d-b88488b35369/manager/0.log"
Mar 19 16:06:46 crc kubenswrapper[4771]: I0319 16:06:46.445223 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4_59ea6c13-a174-4ed2-bd52-8f5af5c11cfb/util/0.log"
Mar 19 16:06:46 crc kubenswrapper[4771]: I0319 16:06:46.651894 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4_59ea6c13-a174-4ed2-bd52-8f5af5c11cfb/pull/0.log"
Mar 19 16:06:46 crc kubenswrapper[4771]: I0319 16:06:46.656430 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4_59ea6c13-a174-4ed2-bd52-8f5af5c11cfb/pull/0.log"
Mar 19 16:06:46 crc kubenswrapper[4771]: I0319 16:06:46.658769 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4_59ea6c13-a174-4ed2-bd52-8f5af5c11cfb/util/0.log"
Mar 19 16:06:46 crc kubenswrapper[4771]: I0319 16:06:46.841294 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-xllmw_3701ec62-21e3-4bb7-8e32-c09fb4c5d619/manager/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.002236 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4_59ea6c13-a174-4ed2-bd52-8f5af5c11cfb/util/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.032323 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4_59ea6c13-a174-4ed2-bd52-8f5af5c11cfb/pull/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.054585 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_db506baa4c5084166f84dc05fe1f17a3d471c47b5a3724aa4429125165gs6p4_59ea6c13-a174-4ed2-bd52-8f5af5c11cfb/extract/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.227661 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-d4wwp_297266bf-7ed9-43bf-abfa-d608acf96290/manager/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.270104 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-9mp28_8d301298-e7fb-4ce7-8369-7d1887b6a913/manager/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.397973 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-7p6l9_f10d9b25-6f62-4300-9827-bebe80433dda/manager/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.436884 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-hgvwc_bb45e644-93ea-41f8-96b5-bf1765f44488/manager/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.670012 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b487c85ff-rdr25_6b6478b8-ae48-43aa-9c8c-d1ee0fbcb992/manager/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.696470 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-5xksc_682bc21b-ae46-487c-b1f8-a8626914fff4/manager/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.892552 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-t7mdc_3196e589-b895-4c94-aa2b-1b4a1b0786cf/manager/0.log"
Mar 19 16:06:47 crc kubenswrapper[4771]: I0319 16:06:47.992822 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-knffk_3fc80c77-8c96-492a-8da1-4e617cfc2878/manager/0.log"
Mar 19 16:06:48 crc kubenswrapper[4771]: I0319 16:06:48.103387 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-w4pll_6c6e6d57-bc57-4368-9b9f-ce85dbf99b46/manager/0.log"
Mar 19 16:06:48 crc kubenswrapper[4771]: I0319 16:06:48.153607 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-mldfm_8caf5d5b-cffa-4b03-b9b9-7bd54217fda6/manager/0.log"
Mar 19 16:06:48 crc kubenswrapper[4771]: I0319 16:06:48.290342 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-5zj7h_051674d9-53cb-4cbc-ae54-b6beb16456ee/manager/0.log"
Mar 19 16:06:48 crc kubenswrapper[4771]: I0319 16:06:48.324574 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-xhqbn_14fa11c5-1371-4da1-aa9b-8b7b2463600e/manager/0.log"
Mar 19 16:06:48 crc kubenswrapper[4771]: I0319 16:06:48.461032 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-4fwmj_7ccc7b81-b1de-48e8-aec9-f88f615ebf88/manager/0.log"
Mar 19 16:06:48 crc kubenswrapper[4771]: I0319 16:06:48.508619 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:06:48 crc kubenswrapper[4771]: E0319 16:06:48.509089 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:06:48 crc kubenswrapper[4771]: I0319 16:06:48.608506 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5655c486f8-5kdjz_ddeeae87-c5db-4209-b4e2-8ad178a811fc/operator/0.log"
Mar 19 16:06:48 crc kubenswrapper[4771]: I0319 16:06:48.850453 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xwfgg_15fecac0-77e3-4e3f-85db-33b3a4fa0232/registry-server/0.log"
Mar 19 16:06:48 crc kubenswrapper[4771]: I0319 16:06:48.866579 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-75b679d9b8-4nqx6_3fdbf2a8-2d30-446e-9aae-a7a00f4efd0d/manager/0.log"
Mar 19 16:06:49 crc kubenswrapper[4771]: I0319 16:06:49.075907 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-6s7qk_0200604c-cbeb-45ea-9f92-b5f857d05b23/manager/0.log"
Mar 19 16:06:49 crc kubenswrapper[4771]: I0319 16:06:49.092590 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-rbjxs_ade0d786-88b7-465a-917a-7147ae923a01/manager/0.log"
Mar 19 16:06:49 crc kubenswrapper[4771]: I0319 16:06:49.229895 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-jrbt9_d6cbfd2b-61bd-433e-95e8-8351340d720f/operator/0.log"
Mar 19 16:06:49 crc kubenswrapper[4771]: I0319 16:06:49.341125 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-r479m_2a2f4027-c0c8-4032-9e40-ab2ce99c899f/manager/0.log"
Mar 19 16:06:49 crc kubenswrapper[4771]: I0319 16:06:49.494756 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-zcdvr_164abed1-8fb0-4276-acc5-08c87a08ba9a/manager/0.log"
Mar 19 16:06:49 crc kubenswrapper[4771]: I0319 16:06:49.524627 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-g6nkz_6d221d10-3c53-4137-88fc-8905e46b397c/manager/0.log"
Mar 19 16:06:49 crc kubenswrapper[4771]: I0319 16:06:49.667932 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-p4bjm_60385688-48aa-4671-9854-60eb4e36f072/manager/0.log"
Mar 19 16:06:54 crc kubenswrapper[4771]: I0319 16:06:54.509478 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:06:54 crc kubenswrapper[4771]: E0319 16:06:54.510197 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:06:58 crc kubenswrapper[4771]: I0319 16:06:58.509755 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"
Mar 19 16:06:58 crc kubenswrapper[4771]: E0319 16:06:58.510281 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 16:06:59 crc kubenswrapper[4771]: I0319 16:06:59.505432 4771 scope.go:117] "RemoveContainer" containerID="b905096a4d7d5ef3e2300501e6f714fad43e606505c9a360c770d5f8b007f72a"
Mar 19 16:06:59 crc kubenswrapper[4771]: I0319 16:06:59.508889 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:06:59 crc kubenswrapper[4771]: E0319 16:06:59.509271 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:07:06 crc kubenswrapper[4771]: I0319 16:07:06.508443 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:07:06 crc kubenswrapper[4771]: E0319 16:07:06.510584 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:07:09 crc kubenswrapper[4771]: I0319 16:07:09.315554 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tzhdc_6c6abe54-14f8-429a-9833-9492e274ec41/control-plane-machine-set-operator/0.log"
Mar 19 16:07:09 crc kubenswrapper[4771]: I0319 16:07:09.478091 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2mx6f_f5ae0ccc-a50b-46d1-b887-28840703ab87/kube-rbac-proxy/0.log"
Mar 19 16:07:09 crc kubenswrapper[4771]: I0319 16:07:09.512911 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2mx6f_f5ae0ccc-a50b-46d1-b887-28840703ab87/machine-api-operator/0.log"
Mar 19 16:07:11 crc kubenswrapper[4771]: I0319 16:07:11.515351 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"
Mar 19 16:07:11 crc kubenswrapper[4771]: E0319 16:07:11.515801 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 16:07:13 crc kubenswrapper[4771]: I0319 16:07:13.509678 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:07:13 crc kubenswrapper[4771]: E0319 16:07:13.510256 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:07:20 crc kubenswrapper[4771]: I0319 16:07:20.509093 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:07:20 crc kubenswrapper[4771]: E0319 16:07:20.509683 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:07:23 crc kubenswrapper[4771]: I0319 16:07:23.037327 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-9wbzz_0e55dcd2-7a64-4176-a47b-4c6ce9b9f663/cert-manager-controller/0.log"
Mar 19 16:07:23 crc kubenswrapper[4771]: I0319 16:07:23.235717 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-74jv6_e6f2cd45-d12c-4239-a907-c3481ed379d1/cert-manager-cainjector/0.log"
Mar 19 16:07:23 crc kubenswrapper[4771]: I0319 16:07:23.349043 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-p56ts_d528bd2f-48f9-4d23-b858-99febe63243c/cert-manager-webhook/0.log"
Mar 19 16:07:25 crc kubenswrapper[4771]: I0319 16:07:25.508815 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"
Mar 19 16:07:25 crc kubenswrapper[4771]: E0319 16:07:25.509363 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 16:07:26 crc kubenswrapper[4771]: I0319 16:07:26.508892 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:07:26 crc kubenswrapper[4771]: E0319 16:07:26.509125 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:07:35 crc kubenswrapper[4771]: I0319 16:07:35.509087 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:07:35 crc kubenswrapper[4771]: E0319 16:07:35.509718 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:07:36 crc kubenswrapper[4771]: I0319 16:07:36.508963 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"
Mar 19 16:07:36 crc kubenswrapper[4771]: E0319 16:07:36.509578 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 16:07:36 crc kubenswrapper[4771]: I0319 16:07:36.633646 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-2rxdq_0ff074f0-de88-4680-a784-82e407cb6a11/nmstate-console-plugin/0.log"
Mar 19 16:07:36 crc kubenswrapper[4771]: I0319 16:07:36.789033 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ctqt6_c6fd6496-7123-4ff3-adea-c38716b6a50a/nmstate-handler/0.log"
Mar 19 16:07:36 crc kubenswrapper[4771]: I0319 16:07:36.799827 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z466c_b72761bc-3ae8-4464-9544-d0ed1781f1e5/kube-rbac-proxy/0.log"
Mar 19 16:07:36 crc kubenswrapper[4771]: I0319 16:07:36.855628 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z466c_b72761bc-3ae8-4464-9544-d0ed1781f1e5/nmstate-metrics/0.log"
Mar 19 16:07:36 crc kubenswrapper[4771]: I0319 16:07:36.960823 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-78vtb_378cb353-a241-4bd4-910e-593931ac24cc/nmstate-operator/0.log"
Mar 19 16:07:37 crc kubenswrapper[4771]: I0319 16:07:37.075273 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-m2mxb_f9f6d719-a63b-4b45-a13c-64480e0dcc69/nmstate-webhook/0.log"
Mar 19 16:07:40 crc kubenswrapper[4771]: I0319 16:07:40.509069 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:07:41 crc kubenswrapper[4771]: I0319 16:07:41.423391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"}
Mar 19 16:07:41 crc kubenswrapper[4771]: I0319 16:07:41.424320 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 16:07:45 crc kubenswrapper[4771]: I0319 16:07:45.460514 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" exitCode=0
Mar 19 16:07:45 crc kubenswrapper[4771]: I0319 16:07:45.460586 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"}
Mar 19 16:07:45 crc kubenswrapper[4771]: I0319 16:07:45.460769 4771 scope.go:117] "RemoveContainer" containerID="2cb341bcaa4357e1a144f46d483a87bcf66466390a7b674cff57c3727c150855"
Mar 19 16:07:45 crc kubenswrapper[4771]: I0319 16:07:45.461379 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:07:45 crc kubenswrapper[4771]: E0319 16:07:45.461670 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:07:47 crc kubenswrapper[4771]: I0319 16:07:47.510073 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:07:48 crc kubenswrapper[4771]: I0319 16:07:48.491041 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"}
Mar 19 16:07:48 crc kubenswrapper[4771]: I0319 16:07:48.491433 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 19 16:07:50 crc kubenswrapper[4771]: I0319 16:07:50.508791 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"
Mar 19 16:07:50 crc kubenswrapper[4771]: E0319 16:07:50.509514 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c"
Mar 19 16:07:52 crc kubenswrapper[4771]: I0319 16:07:52.521975 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" exitCode=0
Mar 19 16:07:52 crc kubenswrapper[4771]: I0319 16:07:52.522036 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"}
Mar 19 16:07:52 crc kubenswrapper[4771]: I0319 16:07:52.522109 4771 scope.go:117] "RemoveContainer" containerID="a6abaaf08e503f1ee61302d3d79f5d8a6acef6453d41a9fac95abf163411a032"
Mar 19 16:07:52 crc kubenswrapper[4771]: I0319 16:07:52.522895 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:07:52 crc kubenswrapper[4771]: E0319 16:07:52.523173 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:07:55 crc kubenswrapper[4771]: I0319 16:07:55.509498 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:07:55 crc kubenswrapper[4771]: E0319 16:07:55.510225 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.158386 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565608-fhtjq"]
Mar 19 16:08:00 crc kubenswrapper[4771]: E0319 16:08:00.159841 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31742b90-a657-49d5-be4a-7e415211fc0a" containerName="registry-server"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.159873 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="31742b90-a657-49d5-be4a-7e415211fc0a" containerName="registry-server"
Mar 19 16:08:00 crc kubenswrapper[4771]: E0319 16:08:00.159913 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31742b90-a657-49d5-be4a-7e415211fc0a" containerName="extract-content"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.159932 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="31742b90-a657-49d5-be4a-7e415211fc0a" containerName="extract-content"
Mar 19 16:08:00 crc kubenswrapper[4771]: E0319 16:08:00.159964 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31742b90-a657-49d5-be4a-7e415211fc0a" containerName="extract-utilities"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.160018 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="31742b90-a657-49d5-be4a-7e415211fc0a" containerName="extract-utilities"
Mar 19 16:08:00 crc kubenswrapper[4771]: E0319 16:08:00.160048 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ac11f8-27e8-4a7d-b019-15427228ae4e" containerName="container-00"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.160066 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ac11f8-27e8-4a7d-b019-15427228ae4e" containerName="container-00"
Mar 19 16:08:00 crc kubenswrapper[4771]: E0319 16:08:00.160089 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1" containerName="oc"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.160106 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1" containerName="oc"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.160511 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ac11f8-27e8-4a7d-b019-15427228ae4e" containerName="container-00"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.160562 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="31742b90-a657-49d5-be4a-7e415211fc0a" containerName="registry-server"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.160601 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1" containerName="oc"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.161861 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565608-fhtjq"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.165013 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.165454 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.165579 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.173595 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565608-fhtjq"]
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.273517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv77x\" (UniqueName: \"kubernetes.io/projected/241d0317-b6ab-4a15-947b-09e42c97b972-kube-api-access-dv77x\") pod \"auto-csr-approver-29565608-fhtjq\" (UID: \"241d0317-b6ab-4a15-947b-09e42c97b972\") " pod="openshift-infra/auto-csr-approver-29565608-fhtjq"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.375080 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv77x\" (UniqueName: \"kubernetes.io/projected/241d0317-b6ab-4a15-947b-09e42c97b972-kube-api-access-dv77x\") pod \"auto-csr-approver-29565608-fhtjq\" (UID: \"241d0317-b6ab-4a15-947b-09e42c97b972\") " pod="openshift-infra/auto-csr-approver-29565608-fhtjq"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.397044 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv77x\" (UniqueName: \"kubernetes.io/projected/241d0317-b6ab-4a15-947b-09e42c97b972-kube-api-access-dv77x\") pod \"auto-csr-approver-29565608-fhtjq\" (UID: \"241d0317-b6ab-4a15-947b-09e42c97b972\") " pod="openshift-infra/auto-csr-approver-29565608-fhtjq"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.483744 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565608-fhtjq"
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.986709 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565608-fhtjq"]
Mar 19 16:08:00 crc kubenswrapper[4771]: I0319 16:08:00.994218 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 16:08:01 crc kubenswrapper[4771]: I0319 16:08:01.596843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565608-fhtjq" event={"ID":"241d0317-b6ab-4a15-947b-09e42c97b972","Type":"ContainerStarted","Data":"a6804f502c90244605d154810f9195d508aa781205bae19ff4da1f4249f81f36"}
Mar 19 16:08:02 crc kubenswrapper[4771]: I0319 16:08:02.605705 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565608-fhtjq" event={"ID":"241d0317-b6ab-4a15-947b-09e42c97b972","Type":"ContainerStarted","Data":"f0dbafc0acac551646f13550d51bde6782b791f75451f4b8cc4165b4e7732e48"}
Mar 19 16:08:02 crc kubenswrapper[4771]: I0319 16:08:02.634145 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565608-fhtjq" podStartSLOduration=1.400712335 podStartE2EDuration="2.634123309s" podCreationTimestamp="2026-03-19 16:08:00 +0000 UTC" firstStartedPulling="2026-03-19 16:08:00.993904106 +0000 UTC m=+3140.222525318" lastFinishedPulling="2026-03-19 16:08:02.22731508 +0000 UTC m=+3141.455936292" observedRunningTime="2026-03-19 16:08:02.630516282 +0000 UTC m=+3141.859137484" watchObservedRunningTime="2026-03-19 16:08:02.634123309 +0000 UTC m=+3141.862744511"
Mar 19 16:08:03 crc kubenswrapper[4771]: I0319 16:08:03.615926 4771 generic.go:334] "Generic (PLEG): container finished"
podID="241d0317-b6ab-4a15-947b-09e42c97b972" containerID="f0dbafc0acac551646f13550d51bde6782b791f75451f4b8cc4165b4e7732e48" exitCode=0 Mar 19 16:08:03 crc kubenswrapper[4771]: I0319 16:08:03.616211 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565608-fhtjq" event={"ID":"241d0317-b6ab-4a15-947b-09e42c97b972","Type":"ContainerDied","Data":"f0dbafc0acac551646f13550d51bde6782b791f75451f4b8cc4165b4e7732e48"} Mar 19 16:08:04 crc kubenswrapper[4771]: I0319 16:08:04.508791 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:08:04 crc kubenswrapper[4771]: E0319 16:08:04.509105 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:08:04 crc kubenswrapper[4771]: I0319 16:08:04.930173 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565608-fhtjq" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.050818 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv77x\" (UniqueName: \"kubernetes.io/projected/241d0317-b6ab-4a15-947b-09e42c97b972-kube-api-access-dv77x\") pod \"241d0317-b6ab-4a15-947b-09e42c97b972\" (UID: \"241d0317-b6ab-4a15-947b-09e42c97b972\") " Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.058627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241d0317-b6ab-4a15-947b-09e42c97b972-kube-api-access-dv77x" (OuterVolumeSpecName: "kube-api-access-dv77x") pod "241d0317-b6ab-4a15-947b-09e42c97b972" (UID: "241d0317-b6ab-4a15-947b-09e42c97b972"). InnerVolumeSpecName "kube-api-access-dv77x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.139411 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-vd9n2_c6259267-1d63-453e-aefd-5eb03b54f532/kube-rbac-proxy/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.152270 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv77x\" (UniqueName: \"kubernetes.io/projected/241d0317-b6ab-4a15-947b-09e42c97b972-kube-api-access-dv77x\") on node \"crc\" DevicePath \"\"" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.251138 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-vd9n2_c6259267-1d63-453e-aefd-5eb03b54f532/controller/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.363170 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-9mlqc_d13f776e-2828-4557-9abf-1d55eab1cf73/frr-k8s-webhook-server/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.476360 4771 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-frr-files/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.509046 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:08:05 crc kubenswrapper[4771]: E0319 16:08:05.509438 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.635085 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565608-fhtjq" event={"ID":"241d0317-b6ab-4a15-947b-09e42c97b972","Type":"ContainerDied","Data":"a6804f502c90244605d154810f9195d508aa781205bae19ff4da1f4249f81f36"} Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.635122 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565608-fhtjq" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.635130 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6804f502c90244605d154810f9195d508aa781205bae19ff4da1f4249f81f36" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.645540 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-reloader/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.689532 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-frr-files/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.694586 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-reloader/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.701288 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-metrics/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.868855 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-metrics/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.875602 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-frr-files/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.902020 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-reloader/0.log" Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.913100 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-metrics/0.log" Mar 19 
16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.993286 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565602-9qsx7"] Mar 19 16:08:05 crc kubenswrapper[4771]: I0319 16:08:05.999706 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565602-9qsx7"] Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.074191 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-frr-files/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.103703 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/controller/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.113589 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-metrics/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.122159 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/cp-reloader/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.296897 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/frr-metrics/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.317626 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/kube-rbac-proxy/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.349149 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/kube-rbac-proxy-frr/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.453713 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/reloader/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.620794 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cc6bc498-fn6sc_6a6a1f64-4048-41cf-a2c6-19eb960fa8ae/manager/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.774803 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7597b4dfd5-tz6lb_789091c5-c870-4a79-ab5c-8e42cf14c768/webhook-server/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.855281 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-z4zw5_775dcd57-e7c4-416d-a758-a5bb4ccc74ab/frr/0.log" Mar 19 16:08:06 crc kubenswrapper[4771]: I0319 16:08:06.883672 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vltvb_8d2a4955-e0a0-42e0-86f5-4812f49a2553/kube-rbac-proxy/0.log" Mar 19 16:08:07 crc kubenswrapper[4771]: I0319 16:08:07.201091 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vltvb_8d2a4955-e0a0-42e0-86f5-4812f49a2553/speaker/0.log" Mar 19 16:08:07 crc kubenswrapper[4771]: I0319 16:08:07.509182 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:08:07 crc kubenswrapper[4771]: E0319 16:08:07.509417 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:08:07 crc kubenswrapper[4771]: I0319 16:08:07.519062 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e824d290-1b4d-410d-8380-acaa5374e4d2" 
path="/var/lib/kubelet/pods/e824d290-1b4d-410d-8380-acaa5374e4d2/volumes" Mar 19 16:08:18 crc kubenswrapper[4771]: I0319 16:08:18.508743 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:08:18 crc kubenswrapper[4771]: E0319 16:08:18.509430 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:08:20 crc kubenswrapper[4771]: I0319 16:08:20.509603 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:08:20 crc kubenswrapper[4771]: E0319 16:08:20.509973 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:08:20 crc kubenswrapper[4771]: I0319 16:08:20.510175 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:08:20 crc kubenswrapper[4771]: E0319 16:08:20.510406 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:08:20 crc kubenswrapper[4771]: I0319 16:08:20.514152 4771 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2_1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d/util/0.log" Mar 19 16:08:20 crc kubenswrapper[4771]: I0319 16:08:20.737300 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2_1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d/pull/0.log" Mar 19 16:08:20 crc kubenswrapper[4771]: I0319 16:08:20.744871 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2_1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d/util/0.log" Mar 19 16:08:20 crc kubenswrapper[4771]: I0319 16:08:20.766144 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2_1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d/pull/0.log" Mar 19 16:08:20 crc kubenswrapper[4771]: I0319 16:08:20.916882 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2_1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d/util/0.log" Mar 19 16:08:20 crc kubenswrapper[4771]: I0319 16:08:20.938670 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2_1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d/extract/0.log" Mar 19 16:08:20 crc kubenswrapper[4771]: I0319 16:08:20.962485 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745tkg2_1d87d435-fb92-4ec5-8c8b-7c2bf3b9574d/pull/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.102734 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp_dda7fbab-2dc2-4bb5-9106-2424eec739d8/util/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.369976 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp_dda7fbab-2dc2-4bb5-9106-2424eec739d8/util/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.419775 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp_dda7fbab-2dc2-4bb5-9106-2424eec739d8/pull/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.435904 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp_dda7fbab-2dc2-4bb5-9106-2424eec739d8/pull/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.564144 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp_dda7fbab-2dc2-4bb5-9106-2424eec739d8/util/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.580548 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp_dda7fbab-2dc2-4bb5-9106-2424eec739d8/pull/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.636966 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dgfpp_dda7fbab-2dc2-4bb5-9106-2424eec739d8/extract/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.745918 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s7zdn_8de902f3-47bf-470f-8d12-1d5920226652/extract-utilities/0.log" Mar 19 16:08:21 crc 
kubenswrapper[4771]: I0319 16:08:21.933630 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s7zdn_8de902f3-47bf-470f-8d12-1d5920226652/extract-utilities/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.966747 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s7zdn_8de902f3-47bf-470f-8d12-1d5920226652/extract-content/0.log" Mar 19 16:08:21 crc kubenswrapper[4771]: I0319 16:08:21.972291 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s7zdn_8de902f3-47bf-470f-8d12-1d5920226652/extract-content/0.log" Mar 19 16:08:22 crc kubenswrapper[4771]: I0319 16:08:22.114958 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s7zdn_8de902f3-47bf-470f-8d12-1d5920226652/extract-content/0.log" Mar 19 16:08:22 crc kubenswrapper[4771]: I0319 16:08:22.115099 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s7zdn_8de902f3-47bf-470f-8d12-1d5920226652/extract-utilities/0.log" Mar 19 16:08:22 crc kubenswrapper[4771]: I0319 16:08:22.360420 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gxxjr_cbefac82-3474-453a-a991-2746e6b18cd3/extract-utilities/0.log" Mar 19 16:08:22 crc kubenswrapper[4771]: I0319 16:08:22.546685 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gxxjr_cbefac82-3474-453a-a991-2746e6b18cd3/extract-utilities/0.log" Mar 19 16:08:22 crc kubenswrapper[4771]: I0319 16:08:22.549189 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gxxjr_cbefac82-3474-453a-a991-2746e6b18cd3/extract-content/0.log" Mar 19 16:08:22 crc kubenswrapper[4771]: I0319 16:08:22.580094 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-gxxjr_cbefac82-3474-453a-a991-2746e6b18cd3/extract-content/0.log" Mar 19 16:08:22 crc kubenswrapper[4771]: I0319 16:08:22.595723 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s7zdn_8de902f3-47bf-470f-8d12-1d5920226652/registry-server/0.log" Mar 19 16:08:22 crc kubenswrapper[4771]: I0319 16:08:22.766446 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gxxjr_cbefac82-3474-453a-a991-2746e6b18cd3/extract-content/0.log" Mar 19 16:08:22 crc kubenswrapper[4771]: I0319 16:08:22.787519 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gxxjr_cbefac82-3474-453a-a991-2746e6b18cd3/extract-utilities/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.030313 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-p8fgq_676c8f4d-415b-4d2d-bc01-2b62ee6c32b5/marketplace-operator/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.092802 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vgqh_ce222397-3f74-43e5-85d9-39ed2aa02daf/extract-utilities/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.205177 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gxxjr_cbefac82-3474-453a-a991-2746e6b18cd3/registry-server/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.301572 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vgqh_ce222397-3f74-43e5-85d9-39ed2aa02daf/extract-content/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.303597 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vgqh_ce222397-3f74-43e5-85d9-39ed2aa02daf/extract-content/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.311221 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vgqh_ce222397-3f74-43e5-85d9-39ed2aa02daf/extract-utilities/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.503161 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vgqh_ce222397-3f74-43e5-85d9-39ed2aa02daf/extract-utilities/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.542903 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vgqh_ce222397-3f74-43e5-85d9-39ed2aa02daf/extract-content/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.555872 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vgqh_ce222397-3f74-43e5-85d9-39ed2aa02daf/registry-server/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.696211 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9jtj_ec3538b3-4651-4029-9e15-c151201f08a0/extract-utilities/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.900490 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9jtj_ec3538b3-4651-4029-9e15-c151201f08a0/extract-content/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.904339 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9jtj_ec3538b3-4651-4029-9e15-c151201f08a0/extract-content/0.log" Mar 19 16:08:23 crc kubenswrapper[4771]: I0319 16:08:23.917650 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9jtj_ec3538b3-4651-4029-9e15-c151201f08a0/extract-utilities/0.log" 
Mar 19 16:08:24 crc kubenswrapper[4771]: I0319 16:08:24.150108 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9jtj_ec3538b3-4651-4029-9e15-c151201f08a0/extract-utilities/0.log" Mar 19 16:08:24 crc kubenswrapper[4771]: I0319 16:08:24.172968 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9jtj_ec3538b3-4651-4029-9e15-c151201f08a0/extract-content/0.log" Mar 19 16:08:24 crc kubenswrapper[4771]: I0319 16:08:24.218630 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9jtj_ec3538b3-4651-4029-9e15-c151201f08a0/registry-server/0.log" Mar 19 16:08:31 crc kubenswrapper[4771]: I0319 16:08:31.518948 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:08:31 crc kubenswrapper[4771]: E0319 16:08:31.519958 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:08:32 crc kubenswrapper[4771]: I0319 16:08:32.508364 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:08:32 crc kubenswrapper[4771]: E0319 16:08:32.508804 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:08:33 crc kubenswrapper[4771]: I0319 16:08:33.511827 4771 scope.go:117] "RemoveContainer" 
containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:08:33 crc kubenswrapper[4771]: E0319 16:08:33.512090 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:08:43 crc kubenswrapper[4771]: I0319 16:08:43.509116 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:08:43 crc kubenswrapper[4771]: E0319 16:08:43.509785 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:08:44 crc kubenswrapper[4771]: I0319 16:08:44.508950 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:08:44 crc kubenswrapper[4771]: E0319 16:08:44.509200 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:08:47 crc kubenswrapper[4771]: I0319 16:08:47.509182 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:08:47 crc kubenswrapper[4771]: E0319 16:08:47.509691 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:08:54 crc kubenswrapper[4771]: I0319 16:08:54.508823 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:08:54 crc kubenswrapper[4771]: E0319 16:08:54.510863 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:08:57 crc kubenswrapper[4771]: I0319 16:08:57.509887 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:08:57 crc kubenswrapper[4771]: E0319 16:08:57.510622 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:08:59 crc kubenswrapper[4771]: I0319 16:08:59.691032 4771 scope.go:117] "RemoveContainer" containerID="8f403cb6431ee82ecf8ce8aeaf223d32343e3c7135ce10e25b02efc791ccf1c9" Mar 19 16:09:00 crc kubenswrapper[4771]: I0319 16:09:00.512085 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:09:00 crc kubenswrapper[4771]: E0319 
16:09:00.512794 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:09:07 crc kubenswrapper[4771]: I0319 16:09:07.509259 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:09:07 crc kubenswrapper[4771]: E0319 16:09:07.510169 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:09:09 crc kubenswrapper[4771]: I0319 16:09:09.509296 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:09:09 crc kubenswrapper[4771]: E0319 16:09:09.510153 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:09:12 crc kubenswrapper[4771]: I0319 16:09:12.508921 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:09:12 crc kubenswrapper[4771]: E0319 16:09:12.510260 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:09:19 crc kubenswrapper[4771]: I0319 16:09:19.510300 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:09:19 crc kubenswrapper[4771]: E0319 16:09:19.512078 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:09:23 crc kubenswrapper[4771]: I0319 16:09:23.509760 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:09:23 crc kubenswrapper[4771]: E0319 16:09:23.510720 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:09:24 crc kubenswrapper[4771]: I0319 16:09:24.509388 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:09:24 crc kubenswrapper[4771]: E0319 16:09:24.509873 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:09:32 crc kubenswrapper[4771]: I0319 16:09:32.508798 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:09:32 crc kubenswrapper[4771]: E0319 16:09:32.509814 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:09:34 crc kubenswrapper[4771]: I0319 16:09:34.509165 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:09:34 crc kubenswrapper[4771]: E0319 16:09:34.509736 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:09:35 crc kubenswrapper[4771]: I0319 16:09:35.480396 4771 generic.go:334] "Generic (PLEG): container finished" podID="4d98595c-7966-42e7-b19c-ed21464e0c22" containerID="4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886" exitCode=0 Mar 19 16:09:35 crc kubenswrapper[4771]: I0319 16:09:35.480543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxq2j/must-gather-hmrzh" event={"ID":"4d98595c-7966-42e7-b19c-ed21464e0c22","Type":"ContainerDied","Data":"4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886"} Mar 19 16:09:35 crc 
kubenswrapper[4771]: I0319 16:09:35.481451 4771 scope.go:117] "RemoveContainer" containerID="4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886" Mar 19 16:09:35 crc kubenswrapper[4771]: I0319 16:09:35.805410 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vxq2j_must-gather-hmrzh_4d98595c-7966-42e7-b19c-ed21464e0c22/gather/0.log" Mar 19 16:09:37 crc kubenswrapper[4771]: I0319 16:09:37.510564 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:09:37 crc kubenswrapper[4771]: E0319 16:09:37.511110 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:09:42 crc kubenswrapper[4771]: I0319 16:09:42.597904 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vxq2j/must-gather-hmrzh"] Mar 19 16:09:42 crc kubenswrapper[4771]: I0319 16:09:42.598898 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vxq2j/must-gather-hmrzh" podUID="4d98595c-7966-42e7-b19c-ed21464e0c22" containerName="copy" containerID="cri-o://cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565" gracePeriod=2 Mar 19 16:09:42 crc kubenswrapper[4771]: I0319 16:09:42.612677 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vxq2j/must-gather-hmrzh"] Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.000611 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vxq2j_must-gather-hmrzh_4d98595c-7966-42e7-b19c-ed21464e0c22/copy/0.log" Mar 19 16:09:43 crc 
kubenswrapper[4771]: I0319 16:09:43.001648 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxq2j/must-gather-hmrzh" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.098730 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjmfw\" (UniqueName: \"kubernetes.io/projected/4d98595c-7966-42e7-b19c-ed21464e0c22-kube-api-access-xjmfw\") pod \"4d98595c-7966-42e7-b19c-ed21464e0c22\" (UID: \"4d98595c-7966-42e7-b19c-ed21464e0c22\") " Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.099197 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d98595c-7966-42e7-b19c-ed21464e0c22-must-gather-output\") pod \"4d98595c-7966-42e7-b19c-ed21464e0c22\" (UID: \"4d98595c-7966-42e7-b19c-ed21464e0c22\") " Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.103934 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d98595c-7966-42e7-b19c-ed21464e0c22-kube-api-access-xjmfw" (OuterVolumeSpecName: "kube-api-access-xjmfw") pod "4d98595c-7966-42e7-b19c-ed21464e0c22" (UID: "4d98595c-7966-42e7-b19c-ed21464e0c22"). InnerVolumeSpecName "kube-api-access-xjmfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.200831 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjmfw\" (UniqueName: \"kubernetes.io/projected/4d98595c-7966-42e7-b19c-ed21464e0c22-kube-api-access-xjmfw\") on node \"crc\" DevicePath \"\"" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.222834 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d98595c-7966-42e7-b19c-ed21464e0c22-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4d98595c-7966-42e7-b19c-ed21464e0c22" (UID: "4d98595c-7966-42e7-b19c-ed21464e0c22"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.302724 4771 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d98595c-7966-42e7-b19c-ed21464e0c22-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.508442 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:09:43 crc kubenswrapper[4771]: E0319 16:09:43.508701 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.528309 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d98595c-7966-42e7-b19c-ed21464e0c22" path="/var/lib/kubelet/pods/4d98595c-7966-42e7-b19c-ed21464e0c22/volumes" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.579383 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-must-gather-vxq2j_must-gather-hmrzh_4d98595c-7966-42e7-b19c-ed21464e0c22/copy/0.log" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.579879 4771 generic.go:334] "Generic (PLEG): container finished" podID="4d98595c-7966-42e7-b19c-ed21464e0c22" containerID="cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565" exitCode=143 Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.579942 4771 scope.go:117] "RemoveContainer" containerID="cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.579980 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxq2j/must-gather-hmrzh" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.600839 4771 scope.go:117] "RemoveContainer" containerID="4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.686172 4771 scope.go:117] "RemoveContainer" containerID="cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565" Mar 19 16:09:43 crc kubenswrapper[4771]: E0319 16:09:43.692687 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565\": container with ID starting with cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565 not found: ID does not exist" containerID="cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.692741 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565"} err="failed to get container status \"cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565\": rpc error: code = NotFound desc = could not find container 
\"cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565\": container with ID starting with cdb3afcc1e2553ec6b9e132532eac6bb9f327d1686d277b5d685f6f26506c565 not found: ID does not exist" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.692772 4771 scope.go:117] "RemoveContainer" containerID="4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886" Mar 19 16:09:43 crc kubenswrapper[4771]: E0319 16:09:43.696770 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886\": container with ID starting with 4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886 not found: ID does not exist" containerID="4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886" Mar 19 16:09:43 crc kubenswrapper[4771]: I0319 16:09:43.696800 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886"} err="failed to get container status \"4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886\": rpc error: code = NotFound desc = could not find container \"4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886\": container with ID starting with 4133e26663e4e53f216eb5cd0730fdd46d6b2d7447530c82a6264fbcbcdd7886 not found: ID does not exist" Mar 19 16:09:46 crc kubenswrapper[4771]: I0319 16:09:46.509056 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:09:46 crc kubenswrapper[4771]: E0319 16:09:46.509731 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" 
podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:09:52 crc kubenswrapper[4771]: I0319 16:09:52.509693 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:09:52 crc kubenswrapper[4771]: E0319 16:09:52.512551 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:09:57 crc kubenswrapper[4771]: I0319 16:09:57.509145 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:09:57 crc kubenswrapper[4771]: E0319 16:09:57.510422 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:09:58 crc kubenswrapper[4771]: I0319 16:09:58.509411 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:09:58 crc kubenswrapper[4771]: E0319 16:09:58.509932 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.179373 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29565610-kzlfb"] Mar 19 16:10:00 crc kubenswrapper[4771]: E0319 16:10:00.179761 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d98595c-7966-42e7-b19c-ed21464e0c22" containerName="copy" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.179775 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d98595c-7966-42e7-b19c-ed21464e0c22" containerName="copy" Mar 19 16:10:00 crc kubenswrapper[4771]: E0319 16:10:00.179807 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d98595c-7966-42e7-b19c-ed21464e0c22" containerName="gather" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.179814 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d98595c-7966-42e7-b19c-ed21464e0c22" containerName="gather" Mar 19 16:10:00 crc kubenswrapper[4771]: E0319 16:10:00.179837 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241d0317-b6ab-4a15-947b-09e42c97b972" containerName="oc" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.179847 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="241d0317-b6ab-4a15-947b-09e42c97b972" containerName="oc" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.180047 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d98595c-7966-42e7-b19c-ed21464e0c22" containerName="copy" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.180069 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="241d0317-b6ab-4a15-947b-09e42c97b972" containerName="oc" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.180085 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d98595c-7966-42e7-b19c-ed21464e0c22" containerName="gather" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.180698 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565610-kzlfb" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.182569 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.182797 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.183617 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.195259 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565610-kzlfb"] Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.227666 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhwl\" (UniqueName: \"kubernetes.io/projected/a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc-kube-api-access-dwhwl\") pod \"auto-csr-approver-29565610-kzlfb\" (UID: \"a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc\") " pod="openshift-infra/auto-csr-approver-29565610-kzlfb" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.330035 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhwl\" (UniqueName: \"kubernetes.io/projected/a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc-kube-api-access-dwhwl\") pod \"auto-csr-approver-29565610-kzlfb\" (UID: \"a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc\") " pod="openshift-infra/auto-csr-approver-29565610-kzlfb" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.360033 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhwl\" (UniqueName: \"kubernetes.io/projected/a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc-kube-api-access-dwhwl\") pod \"auto-csr-approver-29565610-kzlfb\" (UID: \"a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc\") " 
pod="openshift-infra/auto-csr-approver-29565610-kzlfb" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.547014 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565610-kzlfb" Mar 19 16:10:00 crc kubenswrapper[4771]: I0319 16:10:00.978412 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565610-kzlfb"] Mar 19 16:10:00 crc kubenswrapper[4771]: W0319 16:10:00.982109 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda08bef6a_7bf4_492c_b8c2_5a6c02de6ecc.slice/crio-7b58ba808f9e39da62530133fe01d27c40c7fab120b33e5c94fabf3c1abe485a WatchSource:0}: Error finding container 7b58ba808f9e39da62530133fe01d27c40c7fab120b33e5c94fabf3c1abe485a: Status 404 returned error can't find the container with id 7b58ba808f9e39da62530133fe01d27c40c7fab120b33e5c94fabf3c1abe485a Mar 19 16:10:01 crc kubenswrapper[4771]: I0319 16:10:01.756677 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565610-kzlfb" event={"ID":"a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc","Type":"ContainerStarted","Data":"7b58ba808f9e39da62530133fe01d27c40c7fab120b33e5c94fabf3c1abe485a"} Mar 19 16:10:02 crc kubenswrapper[4771]: I0319 16:10:02.765229 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565610-kzlfb" event={"ID":"a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc","Type":"ContainerStarted","Data":"e1be619cb879db6967e4722d19090c45d01bff5d06490f3002f9184bd74b5e59"} Mar 19 16:10:02 crc kubenswrapper[4771]: I0319 16:10:02.784558 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565610-kzlfb" podStartSLOduration=1.498812338 podStartE2EDuration="2.784544096s" podCreationTimestamp="2026-03-19 16:10:00 +0000 UTC" firstStartedPulling="2026-03-19 16:10:00.98366918 +0000 UTC 
m=+3260.212290382" lastFinishedPulling="2026-03-19 16:10:02.269400938 +0000 UTC m=+3261.498022140" observedRunningTime="2026-03-19 16:10:02.783521632 +0000 UTC m=+3262.012142834" watchObservedRunningTime="2026-03-19 16:10:02.784544096 +0000 UTC m=+3262.013165298" Mar 19 16:10:03 crc kubenswrapper[4771]: I0319 16:10:03.800567 4771 generic.go:334] "Generic (PLEG): container finished" podID="a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc" containerID="e1be619cb879db6967e4722d19090c45d01bff5d06490f3002f9184bd74b5e59" exitCode=0 Mar 19 16:10:03 crc kubenswrapper[4771]: I0319 16:10:03.800637 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565610-kzlfb" event={"ID":"a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc","Type":"ContainerDied","Data":"e1be619cb879db6967e4722d19090c45d01bff5d06490f3002f9184bd74b5e59"} Mar 19 16:10:05 crc kubenswrapper[4771]: I0319 16:10:05.180384 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565610-kzlfb" Mar 19 16:10:05 crc kubenswrapper[4771]: I0319 16:10:05.324232 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwhwl\" (UniqueName: \"kubernetes.io/projected/a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc-kube-api-access-dwhwl\") pod \"a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc\" (UID: \"a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc\") " Mar 19 16:10:05 crc kubenswrapper[4771]: I0319 16:10:05.330228 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc-kube-api-access-dwhwl" (OuterVolumeSpecName: "kube-api-access-dwhwl") pod "a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc" (UID: "a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc"). InnerVolumeSpecName "kube-api-access-dwhwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:10:05 crc kubenswrapper[4771]: I0319 16:10:05.425656 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwhwl\" (UniqueName: \"kubernetes.io/projected/a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc-kube-api-access-dwhwl\") on node \"crc\" DevicePath \"\"" Mar 19 16:10:05 crc kubenswrapper[4771]: I0319 16:10:05.828734 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565610-kzlfb" event={"ID":"a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc","Type":"ContainerDied","Data":"7b58ba808f9e39da62530133fe01d27c40c7fab120b33e5c94fabf3c1abe485a"} Mar 19 16:10:05 crc kubenswrapper[4771]: I0319 16:10:05.828816 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b58ba808f9e39da62530133fe01d27c40c7fab120b33e5c94fabf3c1abe485a" Mar 19 16:10:05 crc kubenswrapper[4771]: I0319 16:10:05.828856 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565610-kzlfb" Mar 19 16:10:05 crc kubenswrapper[4771]: I0319 16:10:05.908889 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565604-9jxxb"] Mar 19 16:10:05 crc kubenswrapper[4771]: I0319 16:10:05.920116 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565604-9jxxb"] Mar 19 16:10:07 crc kubenswrapper[4771]: I0319 16:10:07.508670 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:10:07 crc kubenswrapper[4771]: E0319 16:10:07.509543 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:10:07 crc kubenswrapper[4771]: I0319 16:10:07.520528 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e2569e-154b-490c-8ed8-d0143bc882b8" path="/var/lib/kubelet/pods/e6e2569e-154b-490c-8ed8-d0143bc882b8/volumes" Mar 19 16:10:09 crc kubenswrapper[4771]: I0319 16:10:09.509027 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:10:09 crc kubenswrapper[4771]: I0319 16:10:09.509334 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:10:09 crc kubenswrapper[4771]: E0319 16:10:09.509563 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:10:09 crc kubenswrapper[4771]: E0319 16:10:09.509565 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:10:18 crc kubenswrapper[4771]: I0319 16:10:18.509774 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35" Mar 19 16:10:18 crc kubenswrapper[4771]: E0319 16:10:18.510873 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wqbzp_openshift-machine-config-operator(f2b6e948-bbef-4217-b0eb-4cdbf711037c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" Mar 19 16:10:20 crc kubenswrapper[4771]: I0319 16:10:20.509407 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:10:20 crc kubenswrapper[4771]: E0319 16:10:20.510266 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:10:24 crc kubenswrapper[4771]: I0319 16:10:24.509210 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62" Mar 19 16:10:24 crc kubenswrapper[4771]: E0319 16:10:24.510353 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:10:31 crc kubenswrapper[4771]: I0319 16:10:31.514530 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:10:31 crc kubenswrapper[4771]: E0319 16:10:31.515329 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:10:31 crc 
kubenswrapper[4771]: I0319 16:10:31.515499 4771 scope.go:117] "RemoveContainer" containerID="968b4f95c0b111096835bcde7002c8b81ceeae3af6b39b2a7dae5f7403f2ce35"
Mar 19 16:10:32 crc kubenswrapper[4771]: I0319 16:10:32.083950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" event={"ID":"f2b6e948-bbef-4217-b0eb-4cdbf711037c","Type":"ContainerStarted","Data":"3d12d96211f152b0e53682157f5732902daa9200b8410145c09ffc8b4e6aea80"}
Mar 19 16:10:37 crc kubenswrapper[4771]: I0319 16:10:37.509623 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:10:37 crc kubenswrapper[4771]: E0319 16:10:37.510711 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:10:45 crc kubenswrapper[4771]: I0319 16:10:45.509924 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:10:45 crc kubenswrapper[4771]: E0319 16:10:45.510897 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:10:51 crc kubenswrapper[4771]: I0319 16:10:51.515767 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:10:51 crc kubenswrapper[4771]: E0319 16:10:51.516650 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:10:59 crc kubenswrapper[4771]: I0319 16:10:59.510543 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:10:59 crc kubenswrapper[4771]: E0319 16:10:59.511466 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:10:59 crc kubenswrapper[4771]: I0319 16:10:59.809661 4771 scope.go:117] "RemoveContainer" containerID="3c1795755ffa0dd8dd316334b0614d70565fd4f845e356364a9691ff179500c3"
Mar 19 16:11:06 crc kubenswrapper[4771]: I0319 16:11:06.509281 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:11:06 crc kubenswrapper[4771]: E0319 16:11:06.510019 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:11:14 crc kubenswrapper[4771]: I0319 16:11:14.509341 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:11:14 crc kubenswrapper[4771]: E0319 16:11:14.511406 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:11:18 crc kubenswrapper[4771]: I0319 16:11:18.509452 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:11:18 crc kubenswrapper[4771]: E0319 16:11:18.509968 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:11:27 crc kubenswrapper[4771]: I0319 16:11:27.509402 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:11:27 crc kubenswrapper[4771]: E0319 16:11:27.510436 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:11:33 crc kubenswrapper[4771]: I0319 16:11:33.509263 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:11:33 crc kubenswrapper[4771]: E0319 16:11:33.510334 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:11:39 crc kubenswrapper[4771]: I0319 16:11:39.509289 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:11:39 crc kubenswrapper[4771]: E0319 16:11:39.510894 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:11:48 crc kubenswrapper[4771]: I0319 16:11:48.509220 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:11:48 crc kubenswrapper[4771]: E0319 16:11:48.511254 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:11:51 crc kubenswrapper[4771]: I0319 16:11:51.514503 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:11:51 crc kubenswrapper[4771]: E0319 16:11:51.515019 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:11:59 crc kubenswrapper[4771]: I0319 16:11:59.509139 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:11:59 crc kubenswrapper[4771]: E0319 16:11:59.509975 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.153867 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565612-g6fxm"]
Mar 19 16:12:00 crc kubenswrapper[4771]: E0319 16:12:00.154719 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc" containerName="oc"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.154754 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc" containerName="oc"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.155066 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08bef6a-7bf4-492c-b8c2-5a6c02de6ecc" containerName="oc"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.155681 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565612-g6fxm"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.157533 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k42k7"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.158922 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.171025 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565612-g6fxm"]
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.213055 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.227384 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4lfs\" (UniqueName: \"kubernetes.io/projected/653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308-kube-api-access-s4lfs\") pod \"auto-csr-approver-29565612-g6fxm\" (UID: \"653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308\") " pod="openshift-infra/auto-csr-approver-29565612-g6fxm"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.329658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4lfs\" (UniqueName: \"kubernetes.io/projected/653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308-kube-api-access-s4lfs\") pod \"auto-csr-approver-29565612-g6fxm\" (UID: \"653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308\") " pod="openshift-infra/auto-csr-approver-29565612-g6fxm"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.347489 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4lfs\" (UniqueName: \"kubernetes.io/projected/653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308-kube-api-access-s4lfs\") pod \"auto-csr-approver-29565612-g6fxm\" (UID: \"653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308\") " pod="openshift-infra/auto-csr-approver-29565612-g6fxm"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.527981 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565612-g6fxm"
Mar 19 16:12:00 crc kubenswrapper[4771]: I0319 16:12:00.977476 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565612-g6fxm"]
Mar 19 16:12:00 crc kubenswrapper[4771]: W0319 16:12:00.983629 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod653e0ae9_bcb5_4fe4_9f41_fb0d76f2d308.slice/crio-bbe8e2da34ba76d4a677f3bb71761082573bf5166f92016355239f37b78e351b WatchSource:0}: Error finding container bbe8e2da34ba76d4a677f3bb71761082573bf5166f92016355239f37b78e351b: Status 404 returned error can't find the container with id bbe8e2da34ba76d4a677f3bb71761082573bf5166f92016355239f37b78e351b
Mar 19 16:12:01 crc kubenswrapper[4771]: I0319 16:12:01.910164 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565612-g6fxm" event={"ID":"653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308","Type":"ContainerStarted","Data":"bbe8e2da34ba76d4a677f3bb71761082573bf5166f92016355239f37b78e351b"}
Mar 19 16:12:02 crc kubenswrapper[4771]: I0319 16:12:02.920882 4771 generic.go:334] "Generic (PLEG): container finished" podID="653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308" containerID="df346ef3a340d084f26c242c4f208625401351a7a5030975c2753a776e59ec7c" exitCode=0
Mar 19 16:12:02 crc kubenswrapper[4771]: I0319 16:12:02.920950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565612-g6fxm" event={"ID":"653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308","Type":"ContainerDied","Data":"df346ef3a340d084f26c242c4f208625401351a7a5030975c2753a776e59ec7c"}
Mar 19 16:12:04 crc kubenswrapper[4771]: I0319 16:12:04.249403 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565612-g6fxm"
Mar 19 16:12:04 crc kubenswrapper[4771]: I0319 16:12:04.406750 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4lfs\" (UniqueName: \"kubernetes.io/projected/653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308-kube-api-access-s4lfs\") pod \"653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308\" (UID: \"653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308\") "
Mar 19 16:12:04 crc kubenswrapper[4771]: I0319 16:12:04.413476 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308-kube-api-access-s4lfs" (OuterVolumeSpecName: "kube-api-access-s4lfs") pod "653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308" (UID: "653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308"). InnerVolumeSpecName "kube-api-access-s4lfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:12:04 crc kubenswrapper[4771]: I0319 16:12:04.510597 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4lfs\" (UniqueName: \"kubernetes.io/projected/653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308-kube-api-access-s4lfs\") on node \"crc\" DevicePath \"\""
Mar 19 16:12:04 crc kubenswrapper[4771]: I0319 16:12:04.944817 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565612-g6fxm" event={"ID":"653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308","Type":"ContainerDied","Data":"bbe8e2da34ba76d4a677f3bb71761082573bf5166f92016355239f37b78e351b"}
Mar 19 16:12:04 crc kubenswrapper[4771]: I0319 16:12:04.944875 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbe8e2da34ba76d4a677f3bb71761082573bf5166f92016355239f37b78e351b"
Mar 19 16:12:04 crc kubenswrapper[4771]: I0319 16:12:04.944897 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565612-g6fxm"
Mar 19 16:12:05 crc kubenswrapper[4771]: E0319 16:12:05.052501 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod653e0ae9_bcb5_4fe4_9f41_fb0d76f2d308.slice\": RecentStats: unable to find data in memory cache]"
Mar 19 16:12:05 crc kubenswrapper[4771]: I0319 16:12:05.336962 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565606-rpgg5"]
Mar 19 16:12:05 crc kubenswrapper[4771]: I0319 16:12:05.349211 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565606-rpgg5"]
Mar 19 16:12:05 crc kubenswrapper[4771]: I0319 16:12:05.509383 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:12:05 crc kubenswrapper[4771]: E0319 16:12:05.510051 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:12:05 crc kubenswrapper[4771]: I0319 16:12:05.520234 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1" path="/var/lib/kubelet/pods/ef84e2f9-ca97-4b5e-adaa-c8d337e7b0f1/volumes"
Mar 19 16:12:10 crc kubenswrapper[4771]: I0319 16:12:10.509372 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:12:10 crc kubenswrapper[4771]: E0319 16:12:10.511081 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:12:16 crc kubenswrapper[4771]: I0319 16:12:16.509720 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:12:16 crc kubenswrapper[4771]: E0319 16:12:16.510903 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:12:22 crc kubenswrapper[4771]: I0319 16:12:22.509801 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:12:22 crc kubenswrapper[4771]: E0319 16:12:22.511443 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:12:31 crc kubenswrapper[4771]: I0319 16:12:31.521512 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:12:31 crc kubenswrapper[4771]: E0319 16:12:31.522570 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:12:36 crc kubenswrapper[4771]: I0319 16:12:36.509023 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:12:36 crc kubenswrapper[4771]: E0319 16:12:36.510016 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:12:44 crc kubenswrapper[4771]: I0319 16:12:44.865010 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wv8mm"]
Mar 19 16:12:44 crc kubenswrapper[4771]: E0319 16:12:44.865900 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308" containerName="oc"
Mar 19 16:12:44 crc kubenswrapper[4771]: I0319 16:12:44.865912 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308" containerName="oc"
Mar 19 16:12:44 crc kubenswrapper[4771]: I0319 16:12:44.866119 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="653e0ae9-bcb5-4fe4-9f41-fb0d76f2d308" containerName="oc"
Mar 19 16:12:44 crc kubenswrapper[4771]: I0319 16:12:44.867182 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:44 crc kubenswrapper[4771]: I0319 16:12:44.899068 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv8mm"]
Mar 19 16:12:44 crc kubenswrapper[4771]: I0319 16:12:44.945475 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-catalog-content\") pod \"redhat-marketplace-wv8mm\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") " pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:44 crc kubenswrapper[4771]: I0319 16:12:44.945615 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8296\" (UniqueName: \"kubernetes.io/projected/0f60a9f3-be7c-4486-a340-e9253db6d388-kube-api-access-t8296\") pod \"redhat-marketplace-wv8mm\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") " pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:44 crc kubenswrapper[4771]: I0319 16:12:44.945666 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-utilities\") pod \"redhat-marketplace-wv8mm\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") " pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:45 crc kubenswrapper[4771]: I0319 16:12:45.047531 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-utilities\") pod \"redhat-marketplace-wv8mm\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") " pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:45 crc kubenswrapper[4771]: I0319 16:12:45.047921 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-catalog-content\") pod \"redhat-marketplace-wv8mm\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") " pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:45 crc kubenswrapper[4771]: I0319 16:12:45.048040 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-utilities\") pod \"redhat-marketplace-wv8mm\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") " pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:45 crc kubenswrapper[4771]: I0319 16:12:45.048056 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8296\" (UniqueName: \"kubernetes.io/projected/0f60a9f3-be7c-4486-a340-e9253db6d388-kube-api-access-t8296\") pod \"redhat-marketplace-wv8mm\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") " pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:45 crc kubenswrapper[4771]: I0319 16:12:45.048486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-catalog-content\") pod \"redhat-marketplace-wv8mm\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") " pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:45 crc kubenswrapper[4771]: I0319 16:12:45.072352 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8296\" (UniqueName: \"kubernetes.io/projected/0f60a9f3-be7c-4486-a340-e9253db6d388-kube-api-access-t8296\") pod \"redhat-marketplace-wv8mm\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") " pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:45 crc kubenswrapper[4771]: I0319 16:12:45.186294 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:45 crc kubenswrapper[4771]: I0319 16:12:45.415798 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv8mm"]
Mar 19 16:12:45 crc kubenswrapper[4771]: W0319 16:12:45.423372 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f60a9f3_be7c_4486_a340_e9253db6d388.slice/crio-6782e993b1af71d3d78dab64c01f8eb7afb9938fd57b2dbf4d93e5e6ada0a7f7 WatchSource:0}: Error finding container 6782e993b1af71d3d78dab64c01f8eb7afb9938fd57b2dbf4d93e5e6ada0a7f7: Status 404 returned error can't find the container with id 6782e993b1af71d3d78dab64c01f8eb7afb9938fd57b2dbf4d93e5e6ada0a7f7
Mar 19 16:12:46 crc kubenswrapper[4771]: I0319 16:12:46.330390 4771 generic.go:334] "Generic (PLEG): container finished" podID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerID="e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751" exitCode=0
Mar 19 16:12:46 crc kubenswrapper[4771]: I0319 16:12:46.330804 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv8mm" event={"ID":"0f60a9f3-be7c-4486-a340-e9253db6d388","Type":"ContainerDied","Data":"e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751"}
Mar 19 16:12:46 crc kubenswrapper[4771]: I0319 16:12:46.330836 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv8mm" event={"ID":"0f60a9f3-be7c-4486-a340-e9253db6d388","Type":"ContainerStarted","Data":"6782e993b1af71d3d78dab64c01f8eb7afb9938fd57b2dbf4d93e5e6ada0a7f7"}
Mar 19 16:12:46 crc kubenswrapper[4771]: I0319 16:12:46.508452 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f"
Mar 19 16:12:46 crc kubenswrapper[4771]: E0319 16:12:46.508898 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0"
Mar 19 16:12:48 crc kubenswrapper[4771]: I0319 16:12:48.347283 4771 generic.go:334] "Generic (PLEG): container finished" podID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerID="c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78" exitCode=0
Mar 19 16:12:48 crc kubenswrapper[4771]: I0319 16:12:48.347805 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv8mm" event={"ID":"0f60a9f3-be7c-4486-a340-e9253db6d388","Type":"ContainerDied","Data":"c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78"}
Mar 19 16:12:49 crc kubenswrapper[4771]: I0319 16:12:49.357499 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv8mm" event={"ID":"0f60a9f3-be7c-4486-a340-e9253db6d388","Type":"ContainerStarted","Data":"8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769"}
Mar 19 16:12:49 crc kubenswrapper[4771]: I0319 16:12:49.509370 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:12:50 crc kubenswrapper[4771]: I0319 16:12:50.370255 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerStarted","Data":"b54a60c0f63be2ca9eac07f68c3758137116bbdbe49c53d893b968fb5d59e113"}
Mar 19 16:12:50 crc kubenswrapper[4771]: I0319 16:12:50.370860 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 16:12:50 crc kubenswrapper[4771]: I0319 16:12:50.394449 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wv8mm" podStartSLOduration=3.8133731060000002 podStartE2EDuration="6.394425336s" podCreationTimestamp="2026-03-19 16:12:44 +0000 UTC" firstStartedPulling="2026-03-19 16:12:46.332599046 +0000 UTC m=+3425.561220258" lastFinishedPulling="2026-03-19 16:12:48.913651256 +0000 UTC m=+3428.142272488" observedRunningTime="2026-03-19 16:12:49.381248406 +0000 UTC m=+3428.609869608" watchObservedRunningTime="2026-03-19 16:12:50.394425336 +0000 UTC m=+3429.623046578"
Mar 19 16:12:53 crc kubenswrapper[4771]: I0319 16:12:53.027246 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 16:12:53 crc kubenswrapper[4771]: I0319 16:12:53.027583 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 16:12:54 crc kubenswrapper[4771]: I0319 16:12:54.405686 4771 generic.go:334] "Generic (PLEG): container finished" podID="c065c328-37e2-4905-9d1e-82208eab196e" containerID="b54a60c0f63be2ca9eac07f68c3758137116bbdbe49c53d893b968fb5d59e113" exitCode=0
Mar 19 16:12:54 crc kubenswrapper[4771]: I0319 16:12:54.405806 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c065c328-37e2-4905-9d1e-82208eab196e","Type":"ContainerDied","Data":"b54a60c0f63be2ca9eac07f68c3758137116bbdbe49c53d893b968fb5d59e113"}
Mar 19 16:12:54 crc kubenswrapper[4771]: I0319 16:12:54.406026 4771 scope.go:117] "RemoveContainer" containerID="e5c5c66b506efa8694bfa43d4f858b13bff35ced8c256d88612b4333cab5de62"
Mar 19 16:12:54 crc kubenswrapper[4771]: I0319 16:12:54.406926 4771 scope.go:117] "RemoveContainer" containerID="b54a60c0f63be2ca9eac07f68c3758137116bbdbe49c53d893b968fb5d59e113"
Mar 19 16:12:54 crc kubenswrapper[4771]: E0319 16:12:54.407600 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e"
Mar 19 16:12:55 crc kubenswrapper[4771]: I0319 16:12:55.186826 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:55 crc kubenswrapper[4771]: I0319 16:12:55.186919 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:55 crc kubenswrapper[4771]: I0319 16:12:55.247089 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:55 crc kubenswrapper[4771]: I0319 16:12:55.472289 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:55 crc kubenswrapper[4771]: I0319 16:12:55.520681 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv8mm"]
Mar 19 16:12:57 crc kubenswrapper[4771]: I0319 16:12:57.438185 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wv8mm" podUID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerName="registry-server" containerID="cri-o://8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769" gracePeriod=2
Mar 19 16:12:57 crc kubenswrapper[4771]: I0319 16:12:57.920930 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:57 crc kubenswrapper[4771]: I0319 16:12:57.967900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-utilities\") pod \"0f60a9f3-be7c-4486-a340-e9253db6d388\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") "
Mar 19 16:12:57 crc kubenswrapper[4771]: I0319 16:12:57.967965 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-catalog-content\") pod \"0f60a9f3-be7c-4486-a340-e9253db6d388\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") "
Mar 19 16:12:57 crc kubenswrapper[4771]: I0319 16:12:57.968261 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8296\" (UniqueName: \"kubernetes.io/projected/0f60a9f3-be7c-4486-a340-e9253db6d388-kube-api-access-t8296\") pod \"0f60a9f3-be7c-4486-a340-e9253db6d388\" (UID: \"0f60a9f3-be7c-4486-a340-e9253db6d388\") "
Mar 19 16:12:57 crc kubenswrapper[4771]: I0319 16:12:57.970651 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-utilities" (OuterVolumeSpecName: "utilities") pod "0f60a9f3-be7c-4486-a340-e9253db6d388" (UID: "0f60a9f3-be7c-4486-a340-e9253db6d388"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:12:57 crc kubenswrapper[4771]: I0319 16:12:57.974814 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f60a9f3-be7c-4486-a340-e9253db6d388-kube-api-access-t8296" (OuterVolumeSpecName: "kube-api-access-t8296") pod "0f60a9f3-be7c-4486-a340-e9253db6d388" (UID: "0f60a9f3-be7c-4486-a340-e9253db6d388"). InnerVolumeSpecName "kube-api-access-t8296". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:12:57 crc kubenswrapper[4771]: I0319 16:12:57.999287 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f60a9f3-be7c-4486-a340-e9253db6d388" (UID: "0f60a9f3-be7c-4486-a340-e9253db6d388"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.070392 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8296\" (UniqueName: \"kubernetes.io/projected/0f60a9f3-be7c-4486-a340-e9253db6d388-kube-api-access-t8296\") on node \"crc\" DevicePath \"\""
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.070423 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.070434 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f60a9f3-be7c-4486-a340-e9253db6d388-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.450722 4771 generic.go:334] "Generic (PLEG): container finished" podID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerID="8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769" exitCode=0
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.450782 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv8mm" event={"ID":"0f60a9f3-be7c-4486-a340-e9253db6d388","Type":"ContainerDied","Data":"8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769"}
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.450845 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv8mm"
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.450886 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv8mm" event={"ID":"0f60a9f3-be7c-4486-a340-e9253db6d388","Type":"ContainerDied","Data":"6782e993b1af71d3d78dab64c01f8eb7afb9938fd57b2dbf4d93e5e6ada0a7f7"}
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.450921 4771 scope.go:117] "RemoveContainer" containerID="8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769"
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.489664 4771 scope.go:117] "RemoveContainer" containerID="c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78"
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.518746 4771 scope.go:117] "RemoveContainer" containerID="e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751"
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.533774 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv8mm"]
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.550334 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv8mm"]
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.572456 4771 scope.go:117] "RemoveContainer" containerID="8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769"
Mar 19 16:12:58 crc kubenswrapper[4771]: E0319 16:12:58.572856 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769\": container with ID starting with 8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769 not found: ID does not exist" containerID="8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769"
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.572890 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769"} err="failed to get container status \"8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769\": rpc error: code = NotFound desc = could not find container \"8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769\": container with ID starting with 8c276690e9cbb82b61f78d870126df64c23a88307729a654678a4f28cd5dd769 not found: ID does not exist"
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.572913 4771 scope.go:117] "RemoveContainer" containerID="c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78"
Mar 19 16:12:58 crc kubenswrapper[4771]: E0319 16:12:58.573547 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78\": container with ID starting with c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78 not found: ID does not exist" containerID="c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78"
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.573623 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78"} err="failed to get container status \"c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78\": rpc error: code = NotFound desc = could not find container \"c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78\": container with ID starting with c1295e2f012700a211e3e6d738b711c0b47bdd054e20df1e2deb6cf52c592d78 not found: ID does not exist"
Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.573657 4771 scope.go:117] "RemoveContainer" containerID="e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751"
Mar 19 16:12:58 crc kubenswrapper[4771]: E0319
16:12:58.574301 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751\": container with ID starting with e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751 not found: ID does not exist" containerID="e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751" Mar 19 16:12:58 crc kubenswrapper[4771]: I0319 16:12:58.574369 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751"} err="failed to get container status \"e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751\": rpc error: code = NotFound desc = could not find container \"e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751\": container with ID starting with e757ac37bf4616a0bf852cb859152bdca9413df419290fc722b931fb991a7751 not found: ID does not exist" Mar 19 16:12:59 crc kubenswrapper[4771]: I0319 16:12:59.528823 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f60a9f3-be7c-4486-a340-e9253db6d388" path="/var/lib/kubelet/pods/0f60a9f3-be7c-4486-a340-e9253db6d388/volumes" Mar 19 16:12:59 crc kubenswrapper[4771]: I0319 16:12:59.930496 4771 scope.go:117] "RemoveContainer" containerID="76b846dee7526d06a452ddead1528d6db39bb5945e1441b8b6aac9a5efb7a701" Mar 19 16:13:01 crc kubenswrapper[4771]: I0319 16:13:01.517755 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:13:02 crc kubenswrapper[4771]: I0319 16:13:02.494418 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerStarted","Data":"18b3dade220056fdd9e6f166aea4cbef4706d95a1a7ea0ecba6c0aa556af9980"} Mar 19 16:13:02 crc kubenswrapper[4771]: I0319 16:13:02.494957 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 16:13:05 crc kubenswrapper[4771]: I0319 16:13:05.509473 4771 scope.go:117] "RemoveContainer" containerID="b54a60c0f63be2ca9eac07f68c3758137116bbdbe49c53d893b968fb5d59e113" Mar 19 16:13:05 crc kubenswrapper[4771]: E0319 16:13:05.509739 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:13:06 crc kubenswrapper[4771]: I0319 16:13:06.537461 4771 generic.go:334] "Generic (PLEG): container finished" podID="74c5f622-0ced-47f9-80d5-75a09acfafc0" containerID="18b3dade220056fdd9e6f166aea4cbef4706d95a1a7ea0ecba6c0aa556af9980" exitCode=0 Mar 19 16:13:06 crc kubenswrapper[4771]: I0319 16:13:06.537545 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74c5f622-0ced-47f9-80d5-75a09acfafc0","Type":"ContainerDied","Data":"18b3dade220056fdd9e6f166aea4cbef4706d95a1a7ea0ecba6c0aa556af9980"} Mar 19 16:13:06 crc kubenswrapper[4771]: I0319 16:13:06.539497 4771 scope.go:117] "RemoveContainer" containerID="61f71d8f78872ae2069d97734c0786c282834569539d908fa0d4e9fff3931e7f" Mar 19 16:13:06 crc kubenswrapper[4771]: I0319 16:13:06.540736 4771 scope.go:117] "RemoveContainer" containerID="18b3dade220056fdd9e6f166aea4cbef4706d95a1a7ea0ecba6c0aa556af9980" Mar 19 16:13:06 crc kubenswrapper[4771]: E0319 16:13:06.541022 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" 
podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:13:16 crc kubenswrapper[4771]: I0319 16:13:16.508347 4771 scope.go:117] "RemoveContainer" containerID="b54a60c0f63be2ca9eac07f68c3758137116bbdbe49c53d893b968fb5d59e113" Mar 19 16:13:16 crc kubenswrapper[4771]: E0319 16:13:16.509511 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:13:20 crc kubenswrapper[4771]: I0319 16:13:20.508373 4771 scope.go:117] "RemoveContainer" containerID="18b3dade220056fdd9e6f166aea4cbef4706d95a1a7ea0ecba6c0aa556af9980" Mar 19 16:13:20 crc kubenswrapper[4771]: E0319 16:13:20.508918 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:13:23 crc kubenswrapper[4771]: I0319 16:13:23.027945 4771 patch_prober.go:28] interesting pod/machine-config-daemon-wqbzp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:13:23 crc kubenswrapper[4771]: I0319 16:13:23.028300 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wqbzp" podUID="f2b6e948-bbef-4217-b0eb-4cdbf711037c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:13:29 crc 
kubenswrapper[4771]: I0319 16:13:29.509520 4771 scope.go:117] "RemoveContainer" containerID="b54a60c0f63be2ca9eac07f68c3758137116bbdbe49c53d893b968fb5d59e113" Mar 19 16:13:29 crc kubenswrapper[4771]: E0319 16:13:29.511019 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:13:32 crc kubenswrapper[4771]: I0319 16:13:32.509433 4771 scope.go:117] "RemoveContainer" containerID="18b3dade220056fdd9e6f166aea4cbef4706d95a1a7ea0ecba6c0aa556af9980" Mar 19 16:13:32 crc kubenswrapper[4771]: E0319 16:13:32.510408 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:13:40 crc kubenswrapper[4771]: I0319 16:13:40.510678 4771 scope.go:117] "RemoveContainer" containerID="b54a60c0f63be2ca9eac07f68c3758137116bbdbe49c53d893b968fb5d59e113" Mar 19 16:13:40 crc kubenswrapper[4771]: E0319 16:13:40.511364 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(c065c328-37e2-4905-9d1e-82208eab196e)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c065c328-37e2-4905-9d1e-82208eab196e" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.433963 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h46sz"] Mar 19 16:13:44 crc kubenswrapper[4771]: E0319 16:13:44.435118 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerName="registry-server" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.435140 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerName="registry-server" Mar 19 16:13:44 crc kubenswrapper[4771]: E0319 16:13:44.435178 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerName="extract-content" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.435189 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerName="extract-content" Mar 19 16:13:44 crc kubenswrapper[4771]: E0319 16:13:44.435220 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerName="extract-utilities" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.435231 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerName="extract-utilities" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.435500 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f60a9f3-be7c-4486-a340-e9253db6d388" containerName="registry-server" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.437366 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.482594 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h46sz"] Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.520861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f-catalog-content\") pod \"community-operators-h46sz\" (UID: \"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f\") " pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.520944 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxlr\" (UniqueName: \"kubernetes.io/projected/2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f-kube-api-access-nbxlr\") pod \"community-operators-h46sz\" (UID: \"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f\") " pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.521170 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f-utilities\") pod \"community-operators-h46sz\" (UID: \"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f\") " pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.622901 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f-catalog-content\") pod \"community-operators-h46sz\" (UID: \"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f\") " pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.623483 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f-catalog-content\") pod \"community-operators-h46sz\" (UID: \"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f\") " pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.623655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxlr\" (UniqueName: \"kubernetes.io/projected/2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f-kube-api-access-nbxlr\") pod \"community-operators-h46sz\" (UID: \"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f\") " pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.624946 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f-utilities\") pod \"community-operators-h46sz\" (UID: \"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f\") " pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.625251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f-utilities\") pod \"community-operators-h46sz\" (UID: \"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f\") " pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.653704 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxlr\" (UniqueName: \"kubernetes.io/projected/2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f-kube-api-access-nbxlr\") pod \"community-operators-h46sz\" (UID: \"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f\") " pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:44 crc kubenswrapper[4771]: I0319 16:13:44.802105 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h46sz" Mar 19 16:13:45 crc kubenswrapper[4771]: I0319 16:13:45.423336 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h46sz"] Mar 19 16:13:45 crc kubenswrapper[4771]: I0319 16:13:45.889396 4771 generic.go:334] "Generic (PLEG): container finished" podID="2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f" containerID="f431d14e5e165a36c59422f40500a1f8ae2b394e7bf26822417f68d55278f511" exitCode=0 Mar 19 16:13:45 crc kubenswrapper[4771]: I0319 16:13:45.889442 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h46sz" event={"ID":"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f","Type":"ContainerDied","Data":"f431d14e5e165a36c59422f40500a1f8ae2b394e7bf26822417f68d55278f511"} Mar 19 16:13:45 crc kubenswrapper[4771]: I0319 16:13:45.889469 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h46sz" event={"ID":"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f","Type":"ContainerStarted","Data":"608e8db2035033e87c678daf3df5bc5dc5e9e70fc998a38b6a45aec94f6d5e11"} Mar 19 16:13:45 crc kubenswrapper[4771]: I0319 16:13:45.891701 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 16:13:47 crc kubenswrapper[4771]: I0319 16:13:47.508857 4771 scope.go:117] "RemoveContainer" containerID="18b3dade220056fdd9e6f166aea4cbef4706d95a1a7ea0ecba6c0aa556af9980" Mar 19 16:13:47 crc kubenswrapper[4771]: E0319 16:13:47.509399 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(74c5f622-0ced-47f9-80d5-75a09acfafc0)\"" pod="openstack/rabbitmq-server-0" podUID="74c5f622-0ced-47f9-80d5-75a09acfafc0" Mar 19 16:13:50 crc kubenswrapper[4771]: I0319 16:13:50.938098 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-h46sz" event={"ID":"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f","Type":"ContainerStarted","Data":"c207c235224b1ade1faa65f7b78650c5d35d128c69fb88b58fbb541cff5b2a9d"}
Mar 19 16:13:51 crc kubenswrapper[4771]: I0319 16:13:51.956820 4771 generic.go:334] "Generic (PLEG): container finished" podID="2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f" containerID="c207c235224b1ade1faa65f7b78650c5d35d128c69fb88b58fbb541cff5b2a9d" exitCode=0
Mar 19 16:13:51 crc kubenswrapper[4771]: I0319 16:13:51.957244 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h46sz" event={"ID":"2fa1e7d7-4479-4381-8c9e-80b3d0c95d0f","Type":"ContainerDied","Data":"c207c235224b1ade1faa65f7b78650c5d35d128c69fb88b58fbb541cff5b2a9d"}